
Picking the metrics that matter for local government customer experience

At a time when local government is searching for ways to reduce costs and improve public outcomes, a snapshot study of New Zealand local councils reveals three key customer experience indicators that differentiate council performance and quantify the potential for improvements.

Peter Drucker is often quoted as having said “What gets measured gets done”. Four hundred years earlier, the Austrian mathematician Georg Joachim Rheticus made the observation that if you can measure it, you can manage it.

Ever since, having a good set of metrics to assess performance and manage improvement has been considered an essential business practice. Metrics drive behaviour, behaviour drives performance and performance drives outcomes.

Wrong metrics drive wrong outcomes

The problem is, if you choose the wrong metrics, they will often drive the wrong behaviour, and you’ll achieve the wrong outcomes.

For example, in the absence of good customer experience metrics, many local councils use their Request for Service (RFS) systems to monitor the time taken to “close out” service requests compared to a target “service level”. Exception reports are generated and overdue requests are listed and expedited. All good practice, but it’s also far too common for requests to be closed prematurely before they are fully actioned, so departments, individuals and sub-contractors can escape the scrutiny of the much-feared “overdue RFS report”. When customers are later surveyed they reveal that many requests remain unactioned.
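As a minimal sketch of how such an exception report is typically produced (all record and field names here are hypothetical, not drawn from any particular RFS system), open requests can be checked against their target service levels like this:

```python
from datetime import datetime, timedelta

# Hypothetical RFS records: request id, date logged, date closed
# (None = still open), and the target service level for that request type.
requests = [
    {"id": 101, "logged": datetime(2024, 3, 1), "closed": datetime(2024, 3, 4), "target_days": 5},
    {"id": 102, "logged": datetime(2024, 3, 1), "closed": None, "target_days": 3},
    {"id": 103, "logged": datetime(2024, 3, 2), "closed": None, "target_days": 10},
]

def overdue_report(requests, today):
    """List open requests that have exceeded their target service level."""
    overdue = []
    for r in requests:
        if r["closed"] is None and today - r["logged"] > timedelta(days=r["target_days"]):
            overdue.append(r["id"])
    return overdue

print(overdue_report(requests, datetime(2024, 3, 10)))  # [102]
```

Note that a report like this only sees the recorded close-out date; it cannot detect a request that was closed prematurely, which is exactly why the customer-side metrics below are needed alongside it.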

Hence even a good metric, used badly (or in isolation), often drives the wrong behaviour and results in poor outcomes.

Metrics that matter drive continuous improvement

Our work with local government has revealed three key metrics that, applied to local councils’ “request for service” processes, can differentiate council performance and quantify the potential for saving costs and improving council and public outcomes.

The three key metrics are:

Average number of contacts (interaction efficiency metric)

The first key metric has a direct impact on council resources and the cost of managing and responding to customer requests for service. It’s a measure of the number of times customers need to make contact with the council before the request is actioned.

If customers need to contact the council more than once to have a request actioned, it has a costly impact on council efficiency, and a negative impact on customer satisfaction and community advocacy and support.

For example, if the average number of contacts for a council is 1.2, this indicates at least a 20% additional burden on its contact centre and customer interaction processes, and many additional costs and distractions resulting from calls escalating through the council to other departments, managers and elected representatives.
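The arithmetic behind this metric is straightforward. Using hypothetical per-request contact counts (e.g. gathered from follow-up surveys), the average and the implied extra contact-centre burden can be computed as:

```python
# Number of contacts each customer made before their request was
# actioned (hypothetical survey data for ten closed requests).
contacts_per_request = [1, 1, 2, 1, 1, 1, 2, 1, 1, 1]

avg_contacts = sum(contacts_per_request) / len(contacts_per_request)
extra_burden = avg_contacts - 1.0  # contacts beyond the single contact that should suffice

print(f"Average contacts: {avg_contacts:.1f}")                   # 1.2
print(f"Additional contact-centre burden: {extra_burden:.0%}")   # 20%
```

An average of 1.2 contacts per request thus translates directly into the 20% additional interaction load described above.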

Customer perceived completion (process effectiveness metric)

Expressed as a percentage of closed cases, this key metric tracks cases where customers believe that the response and actions promised by the council have not yet been completed.

Customer surveys are conducted shortly after each case has been closed, to ensure that the requested service has been completed before the customer is asked for their feedback.
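A minimal sketch of the calculation, assuming hypothetical post-closure survey responses keyed by case number:

```python
# Post-closure survey responses for closed cases (hypothetical data):
# True = the customer agrees the promised work was completed.
survey = {201: True, 202: True, 203: False, 204: True, 205: False}

closed_cases = len(survey)
perceived_complete = sum(survey.values())       # True counts as 1
completion_rate = perceived_complete / closed_cases

print(f"Customer perceived completion: {completion_rate:.0%}")  # 60%
```

A gap between this figure and the 100% completion implied by the RFS system’s own close-out records is the signature of premature case closure.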

Customer Satisfaction Index (customer outcomes metric)

The Customer Satisfaction Index (CSI) is the principal top-level metric summarising the customer experience outcomes from the RFS process.

The CSI is calculated from the aggregated index of three specific performance outcomes:

Customer advocacy is measured in terms of the likelihood that customers would recommend a friend or neighbour contact the council for assistance if they experience a similar problem or concern.
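Of the three component outcomes, only customer advocacy is named above, so in the sketch below the other two component names are purely illustrative placeholders, and the equal-weight aggregation is an assumption, not the CTMA method:

```python
# Component scores on a 0-10 scale (hypothetical data). Only "advocacy"
# is named in the text; the other two names are placeholders.
components = {
    "advocacy": 7.8,          # likelihood of recommending the council
    "outcome_score_2": 8.1,   # placeholder for the second outcome
    "outcome_score_3": 6.9,   # placeholder for the third outcome
}

# Simple equal-weight index, rescaled to 0-100.
csi = sum(components.values()) / len(components) * 10
print(f"CSI: {csi:.1f}")
```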

Poor scores in these three key metrics often indicate:

  1. Customer expectations are not being set correctly
  2. Customers are not being kept informed of progress
  3. Case management practices are poor and response processes are weak

Turning metrics into management actions

Disaggregating these scores further and examining the performance of each department, request type and individual can reveal a multitude of “customer experience champions” and “service saboteurs”.
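Disaggregation of this kind is, in essence, a group-by over the per-request scores. A small sketch (department names and scores are hypothetical):

```python
from collections import defaultdict

# Per-request satisfaction scores tagged by department (hypothetical data).
records = [
    {"dept": "Roading", "score": 8},
    {"dept": "Roading", "score": 9},
    {"dept": "Parks",   "score": 4},
    {"dept": "Parks",   "score": 5},
    {"dept": "Water",   "score": 7},
]

by_dept = defaultdict(list)
for r in records:
    by_dept[r["dept"]].append(r["score"])

for dept, scores in sorted(by_dept.items()):
    print(f"{dept}: average score {sum(scores) / len(scores):.1f}")
```

The same breakdown can be repeated by request type, receiving officer or subcontractor to locate where performance is strongest and weakest.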

Councils that combine these three key performance metrics with a diagnostic approach to measuring and managing customer experience can obtain incremental and measurable improvements in their interaction efficiency, process effectiveness and customer and council outcomes.

This diagnostic approach requires two important elements:

 

This approach helps councils identify council-wide and individual opportunities for improvement, and direct their remedial action plans to focus on cost-effective activities that make the greatest improvement to customer and council outcomes.

Councils participating in our snapshot studies have been able to gain department-level insight into which service types perform well and which offered the greatest opportunities for improvement. Councils participating in our ongoing measurement programmes receive monthly reports that provide feedback by individual request channel, receiving officer, request type, action officer and subcontractor.

© CTMA New Zealand Ltd.


Our snapshot study reveals the scope of untapped improvement opportunities and cost savings for local councils. To see how well New Zealand councils score, request a copy of the CTMA working paper.

For more information about the CTMA approach to turning the voice-of-the-customer into management actions, join our reading room, or if we can help you more directly, please get in touch and we can schedule an introductory discussion.




About the author: Paul Linnell

Paul Linnell is a customer experience and service quality improvement champion, working internationally with senior managers and their teams to help them achieve business success, reduce risk and build customer loyalty and advocacy by taking actions to improve customer experiences. Paul specialises in the design and deployment of customer experience measurement, service quality improvement, complaints handling and preventive analysis programmes. Industries he has worked with include automotive, consumer electronics, consumer goods, electricity and gas retail, financial services, information technology, local government and the public sector, media and publishing, passenger travel (rail, air and sea), pharmaceuticals and telecommunications. Paul has worked with clients and presented on these subjects at conferences and corporate events in the UK, Europe, North America, the Middle East, Australia and New Zealand. Originally from the UK and now based in New Zealand, he continues to serve clients globally.
