Why relying solely on NPS could be misleading
Providing top-notch customer service is a top priority these days – it ensures customers leave happy, and makes them more likely to return and recommend you to others. This is why businesses have long relied heavily on Net Promoter Score (NPS) data to gauge customer satisfaction – but as a metric, it has its drawbacks. NPS gives insight into general customer satisfaction, i.e. likelihood to recommend, but it cannot guide a business to the specific areas that need focus in order to improve.
There’s no doubt that NPS has value, or so many businesses wouldn’t still be using it today. Indeed, it can be used to predict customer lifetime value, price sensitivity, willingness to try new products and a range of other metrics associated with loyalty. Therefore, when you see that something you’ve done has changed your NPS, you can estimate that change’s impact on financial outcomes. It’s also valuable because it is easy to benchmark against your direct competitors on an ongoing basis.
However, it does have a number of limitations, mostly driven by the low sample size associated with receipt and email surveys. Specifically, NPS responses often suffer from selection bias: the people who agree to take these types of surveys tend to be either your most loyal customers or those who are highly disgruntled. NPS scores therefore likely overweight both loyal customers and detractors.
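The arithmetic behind NPS is simple, and sketching it makes the sample-size problem concrete. Below is a minimal Python sketch: the score bands (0–6 detractor, 7–8 passive, 9–10 promoter) are the standard NPS definition, while the sample responses are invented purely for illustration of how wildly a small, self-selected sample can swing the score.

```python
from typing import Iterable

def nps(scores: Iterable[int]) -> float:
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6),
    on a -100 to +100 scale. Passives (7-8) only count in the denominator."""
    scores = list(scores)
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Hypothetical data: the same store surveyed in two consecutive weeks,
# with only ~20 self-selected responses each time.
week_1 = [10, 9, 9, 10, 8, 7, 9, 10, 2, 1, 9, 10, 8, 9, 10, 3, 9, 10, 7, 9]
week_2 = [10, 9, 8, 7, 2, 1, 0, 9, 10, 3, 8, 7, 9, 2, 10, 1, 9, 8, 7, 0]

print(nps(week_1))  # 50.0
print(nps(week_2))  # 0.0
```

A 50-point swing between two weeks at the same store says far more about who happened to respond than about anything the store did differently – which is exactly why low-sample receipt surveys are hard to act on.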
Additionally, NPS only gives a brand-level view, so it can’t be used effectively to improve customer satisfaction at the store level. Some stores may be much more effective at driving likelihood to recommend than others, but you won’t see that with the low sample size that receipt surveys gather. A brand-level view is helpful but difficult to act on – you can’t confidently see which store locations are examples of best practice, or which need more attention.
And herein lies the problem: NPS doesn’t paint an accurate picture of what is happening at the store level. The feedback NPS gathers is most valuable when it tells you how to convert a detractor into a raving fan. But without mass representative data, polled on multiple metrics across each location, it’s impossible to know why a detractor is a detractor, and therefore difficult to act on NPS alone.
Businesses need to find ways to collect more store-level data and ask multiple questions, in aggregate, in a low-touch, low-friction way. For example, one of the most fundamental but revealing questions a store can ask is “How was your service today?” Many different things can impact service scores, but mainly this comes down to staff: Are there enough people on shift? Have they been equipped with the right knowledge and training to help? And is management leading them effectively? Service can also be greatly affected by a manager who inspires and a team that is motivated. In many cases when service scores drop, the manager is the reason. More training, a new manager or even just a brief check-in can often inspire improved results.
It's also really useful to ask about the customer’s experience. This is a very hot metric right now, as it can be a challenge for businesses to measure consumer expectations against the actual in-store experience. The “experience” question is also perfect for testing things such as a new store layout, a different menu or a new checkout experience before investing in those changes.
To sum up...
Businesses must bear in mind that NPS gives only a narrow view of your customers, and the data isn’t representative of your entire customer base. To better understand what’s driving customers and impacting revenue, they need simple, low-friction ways to collect genuine customer feedback at scale.