Customer experience measurement: You can't manage what you don't measure
Mark Gentry examines the need for customer experience programmes to start delivering real value for clients. In this best practice feature, he outlines an approach based on greater measurement, which can help deliver a more effective customer strategy.
- Strategic healthcheck:
  - Relationship monitoring: provide a regular 'healthcheck' on the level of positive behaviour, feelings and opinions among the customer base as a whole.
  - Priority setting: identify which 'touchpoints' have the greatest ability to influence the customer’s perceptions of the relationship, both in terms of their 'reach' within the customer base and the level of impact that they have for individual customers.
- Tactical-level diagnostics:
  - Performance monitoring: provide ongoing measurement of customers’ opinions of service performance in those 'touchpoints' and 'moments of truth' in the customer lifecycle that are shown to have the greatest impact at a strategic level.
  - Understand customer needs: generate 'rich' qualitative insight into what good service looks like to customers, to provide a model to work to.
  - Targeted measurement: compare performance between specific customer groups, channels, teams etc. to identify where action is most urgently needed.
A customer experience programme ultimately needs to encourage the customer behaviours that create value for the business:
- Staying with you.
- Spending more with you.
- Advocacy of your business to others.
- Compliance with your requests.
- Support for you in broader community issues.
Net Promoter Score (NPS) – the percentage of 'promoters' (scoring 9 or 10 on the 0-10 'likelihood to recommend' scale) minus the percentage of 'detractors' (scoring 0 to 6) – is a popular headline metric, but it has limitations:
- The link between customer advocacy and business performance is not fully understood, and its validity as a predictor of business performance on its own has been questioned by some studies – advocacy may not equate to success for all businesses.
- The 0-10 rating scale is not always used consistently by customers, e.g. we have seen many cases where, based on subsequent comments, it is apparent that some customers regard 6, or even 5, as a high score, though these customers would be classed as 'detractors' in the NPS calculation.
- The way NPS is calculated makes it a volatile statistic, prone to extreme fluctuations: because 'passives' (scoring 7 or 8) are ignored, small shifts in scores across either threshold can move the headline figure sharply.
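To illustrate that volatility, here is a minimal sketch of the standard NPS calculation (promoters score 9-10, detractors 0-6; the survey data is invented for illustration):

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

# Two hypothetical survey waves, identical except that one respondent
# gives a 6 instead of a 7 in the second wave.
wave1 = [9, 9, 8, 7, 7, 7, 6, 6, 5, 3]
wave2 = [9, 9, 8, 7, 7, 6, 6, 6, 5, 3]
print(nps(wave1))  # -20.0
print(nps(wave2))  # -30.0
```

A single respondent crossing the 6/7 boundary moves the headline score by ten points here, which is why small-sample NPS tracking can fluctuate sharply between waves.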
If benchmarking against competitors is included, bear the following in mind:
- Select competitors that provide valid and useful comparisons – there is no point spending money on comparing your performance with companies that are fundamentally different or where you cannot identify why differences appear.
- Although attractive, it can be very costly to reach customers of high quality 'niche' competitors in sufficient volume to give you robust data.
- Take into account the difference in the profile of competitors’ customers when analysing results – there may be a natural tendency for your results to look worse on paper if you have a more demanding customer base.
- Take into account the different way that competitors deal with service to their customers – your customers may have a substantially different set of experiences dealing with you than they would have dealing with a competitor, and some comparisons may not be comparing like with like.
A typical programme of work might run as follows:
- Qualitative research with customers to establish their overall views on service, identify 'moments of truth' and their likely behaviour in response to good or bad service.
- Stakeholder interviews to gather the views of key internal personnel.
- Benchmark quantitative research at strategic level.
- Key driver analysis to establish priorities – evaluate which experiences have the greatest positive/negative impact on customer loyalty.
- Establish event-driven tracking.
- Tracking waves of strategic research.
- Further qualitative research into specific events if needed to give greater depth of insight into customer experiences.
- Conduct local workshops with staff responsible for delivering key customer experiences.
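The key driver analysis step above is often approximated, in its simplest form, by correlating each touchpoint's satisfaction ratings with an overall loyalty measure. A minimal sketch of that approach (the touchpoint names and ratings are hypothetical; real programmes typically use more robust regression-based derived-importance models):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical survey data: one loyalty score and two touchpoint
# satisfaction ratings per respondent.
loyalty = [9, 7, 8, 4, 6, 9, 3, 8]
touchpoints = {
    "complaint_handling": [9, 6, 8, 3, 5, 9, 2, 7],
    "billing_accuracy":   [8, 8, 7, 7, 6, 8, 7, 8],
}

# Touchpoints whose ratings correlate most strongly with loyalty
# are the candidate 'key drivers' for tactical-level tracking.
drivers = {tp: round(pearson(r, loyalty), 2) for tp, r in touchpoints.items()}
```

In this invented sample, complaint handling tracks loyalty far more closely than billing accuracy, so it would be the priority for ongoing measurement.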
Finally, reporting and use of the data are key to maximising the return on investment. Outputs should be tailored to each audience:
- Senior management – face-to-face presentations, dashboard metrics.
- Channel/team management – presentations, specific metrics, sub-level comparisons.
- Tactical level (team, etc) – access to tracking data on the areas they are responsible for, targets, action plans, customer comments, 'red flag' customer issues highlighted by the research.
- Branch level – rolling data on good/poor performing branches to monitor maintenance and improvement activity.
Mark Gentry is research manager at McCallum Layton.