OK, I’ll admit right off the bat that this headline’s more than a little provocative. We do care about Service Level Agreements (SLAs). In fact, we work hard to maintain our industry-busting average of 85% of SLA targets met. It’s just that we also happen to believe - passionately - that meeting SLA targets is a pretty low benchmark for quality of service. Allow me to explain.
Why meeting SLA targets doesn’t always equal a happy customer
In the context of managed IT services, an SLA generally covers response times (how long it takes for the service provider to acknowledge a customer’s issue and begin working on it) and resolution times (how long it takes for the service provider to solve the issue), across a range of areas. Each issue is assigned a priority ranking, with corresponding agreed response and resolution times.
The theory goes that if the service provider responds to, and resolves, the customer’s issue within those agreed timeframes, customer satisfaction is guaranteed. Job done.
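To make that mechanism concrete, here’s a minimal sketch of how an SLA priority matrix might look in code. The tier names and timeframes are illustrative assumptions for the sake of the example, not Origin’s actual contractual figures.

```python
from datetime import timedelta

# Illustrative priority matrix: each tier maps to agreed
# (response target, resolution target) pairs. These numbers are
# invented for illustration only.
SLA_MATRIX = {
    "P1 - Critical": (timedelta(minutes=15), timedelta(hours=4)),
    "P2 - High":     (timedelta(hours=1),    timedelta(hours=8)),
    "P3 - Medium":   (timedelta(hours=4),    timedelta(days=2)),
    "P4 - Low":      (timedelta(days=1),     timedelta(days=5)),
}

def sla_met(priority, response_time, resolution_time):
    """True only if BOTH the response and resolution targets
    for the ticket's priority tier were hit."""
    response_target, resolution_target = SLA_MATRIX[priority]
    return (response_time <= response_target
            and resolution_time <= resolution_target)
```

Notice what this measures, and what it doesn’t: a ticket closed one minute inside its resolution target “passes” exactly as well as one closed in half the time, and nothing here records whether the customer was actually happy with the outcome.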
But as technology becomes ever more complex, and businesses and their employees need to be ‘always on’ more than ever, simply hitting response and resolution targets is no guarantee of customer satisfaction. In fact, many times in my career I’ve come across a customer who’s had their SLA targets consistently met by their provider, but is still very unhappy with the service they’re receiving.
Beyond the SLA: What counts as a job well done at Origin
Response and resolution times are totally objective and easily measured, hence their inclusion in SLAs. But there are ways to measure quality of service that go far beyond the level of commitment we’re contractually required to meet.
At the individual level, we measure customer satisfaction by way of a customer satisfaction score, or CSAT. Every single time we engage with an Origin customer, we ask them to let us know how satisfied they were with the interaction. On average, 94.7% of end users who provide feedback rate their experience as positive, well above the industry average.
At the account level, we use a Net Promoter Score, or NPS, which measures our customers’ willingness to recommend our services to others. The NPS serves as a proxy for gauging our customers’ overall satisfaction with our service. Finally, at the project level, we utilise Post Implementation Reviews to evaluate whether project objectives were met, how effectively the project was run and whether there were any lessons to be gleaned from the experience.
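For readers unfamiliar with how these two metrics are computed, here’s a minimal sketch. The NPS formula (% promoters minus % detractors on a 0–10 “would you recommend us?” scale) is the standard industry definition; the CSAT threshold of 4-or-above on a 1–5 scale is a common convention I’m assuming here, not necessarily the exact scale Origin surveys use.

```python
def csat_score(responses):
    """CSAT: percentage of survey responses rated positive.
    Assumes a 1-5 scale where 4 and 5 count as positive."""
    positive = sum(1 for r in responses if r >= 4)
    return 100 * positive / len(responses)

def nps_score(responses):
    """NPS: % promoters (9-10) minus % detractors (0-6) on the
    standard 0-10 recommendation scale. Passives (7-8) count
    toward the total but neither add nor subtract."""
    promoters = sum(1 for r in responses if r >= 9)
    detractors = sum(1 for r in responses if r <= 6)
    return 100 * (promoters - detractors) / len(responses)
```

For example, `nps_score([10, 9, 8, 7, 3])` gives 20.0: two promoters minus one detractor, over five responses. NPS can range from -100 (all detractors) to +100 (all promoters), which is why it works well as an account-level proxy for overall sentiment rather than a per-ticket pass/fail.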
High scores on these three measures, in addition to hitting our SLA targets, are a truly accurate indication that we’re doing a great job by our customers.
SLAs don’t drive continuous improvement
There’s another reason for going beyond SLA metrics. In and of themselves, they tell us little about how we can continue to improve our service.
On the other hand, every single time we receive a negative CSAT score, we pick up the phone and call our customer so as to better understand their experience and what we could have done better. We use their feedback to create and implement an ‘improvement plan’, enabling us to continually improve our service to the benefit of that customer and our wider customer base.
At Origin, our commitment to customer service goes far beyond the SLA. By giving our customers a voice, we can continue to meet, and exceed, their evolving expectations.