
How to conduct customer surveys that engage customers and enlighten you

Customer surveys are more important than ever as consumer preferences and behaviours change. But survey responses have dropped. Here are some survey design best practices to help. 

10th Aug 2021

Customer surveys are a crucial source of insight for organisations, helping them learn about the experiences they deliver, the customer satisfaction levels they achieve, and the preferences of their customers. However, even though surveys can translate into improved experiences, products and services, customer enthusiasm for them has waned.

Part of the problem is that customers have found themselves bombarded by rising numbers of survey requests in recent years. So it was little surprise that when OpinionLab conducted research into the impact of over-surveying, nearly three-quarters (72%) of respondents said it interfered with their online experiences, while 80% said they often abandon surveys before even reaching the halfway point.

The pandemic, meanwhile, has added an additional complication. Clearly it is a time of enormous upheaval and customer behaviours and requirements are shifting rapidly. But at the same time, is it appropriate to solicit survey feedback at such a sensitive time? Heart of the Customer's Jim Tincher debated this very point, and what is clear is that the nature and tone of the survey is absolutely crucial.

All of this demonstrates that survey design is becoming more important than ever. So what best practices should you adhere to whilst developing your own customer surveys?

Ask why

Dr. Tom Wormald, customer behaviour specialist and chief development officer at Yonder Consulting, says that asking ‘why’ can often be the hardest part of the overall process, as it involves getting to the root of what drove a customer to your brand to begin with:

“Often what we don’t know is the why. We’re able to measure and understand that, say, ‘on the telephone we perform this way’ and ‘online we perform that way’ and so on and so forth. But what we don’t know is why.

“60-70% of a customer’s journey has already happened before they reach a first touchpoint…all of the things that have caused them to come to you to begin with are kind of invisible and difficult to pick up in any typical survey environment. Therefore, when you’re trying to establish what’s driving the likelihood of a customer to recommend or be satisfied, you need to ensure you’ve done as much analysis as is possible to establish what’s got them to the point where you want to survey them.”

Be relevant and timely

Increasingly, the relevance and point of contact can be a key determinant of whether customers decide to provide feedback via a survey you’ve put out.

A key example of this is highlighted in Interaction Metrics’ Point-of-Purchase Survey Study, which examined the surveying methods of top US retailers.

“Walmart asked four introductory questions irrelevant to the customer’s experience, and required the input of two receipt codes. Really? That’s a hassle,” Martha Brooke, chief customer experience analyst and founder of Interaction Metrics, told Customer Think in discussing the report’s results.

“A survey should certainly never take longer than the interaction itself; in fact, it should take less time. Family Dollar asked a whopping 69 questions in their survey—with ten seconds a question that’s over ten minutes spent reflecting on items that cost a buck.”

Engagement expert James Bolle believes many more surveys should lean on easily accessible data to better guide businesses on their point of delivery:

“If a survey is given to a customer at a time where they’ve had a high level of connection, they’ve had an important touchpoint, an experience that matters to them and is respectful of their time and allows them to tell their story and what’s important to them, then that’s a relevant survey.

“If you’re halfway through a meal it’s not much use getting an email asking you how your meal was because you haven’t finished it. You have to send surveys at the right time and in a way that means customers are happy to converse with a brand.

“We’ve seen a lot of success with brands that are integrating feedback into their mobile apps, or issuing digital invitations in a much more sensitive way – so, at a point where you know someone has been in your store, called your contact centre or been on your website, you’re able to deliver them a personalised opportunity to provide feedback.” 

Only ask things you don’t know

This is a common best practice that many brands fail to adhere to. Maurice Fitzgerald, an author of multiple books on customer-centric strategy, believes that, in the same vein as with relevance and timeliness, the key is in the data you already hold.

“You will use your customer database to determine who to contact by phone or email. You already know who they are, so don’t ask them their names.

“You already know what company they work for and what size it is, so don’t ask. Use the information you have to demonstrate that you remember the customer. This is less common than you might expect. I have even been asked for my email address in a survey sent to me by email.”

And Bolle says brands can extend this beyond the data they hold themselves, to third-party data they have easy access to.

“It’s not just about not asking questions you already have the answer to, it’s also about not asking questions you should know the answer to.

“For instance, our chief scientist bought a new car. He was sent a 10-page survey and at the top it had his licence plate number and the very first question was ‘what sort of car have you bought?’ Clearly that’s information that this brand should know. Don’t waste your customers’ time.”
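
For teams that build their own survey flows, this principle can also be applied programmatically: before an invitation goes out, drop any question whose answer is already sitting in the customer record. The Python sketch below illustrates the idea only; the field names, questions and helper function are hypothetical rather than drawn from any particular survey tool.

    # Illustrative sketch: skip survey questions whose answers are already
    # held in the customer record, so respondents are never asked for data
    # the brand should already know. All names here are hypothetical.
    customer_record = {
        "name": "Jane Example",
        "email": "jane@example.com",
        "company": "Example Ltd",
        "company_size": "50-249",
    }

    survey_questions = [
        {"id": "name", "text": "What is your name?", "known_field": "name"},
        {"id": "email", "text": "What is your email address?", "known_field": "email"},
        {"id": "nps", "text": "How likely are you to recommend us?", "known_field": None},
        {"id": "improve", "text": "How could we improve our service to you?", "known_field": None},
    ]

    def questions_to_ask(questions, record):
        """Keep only the questions whose answer is not already on file."""
        return [q for q in questions
                if not (q["known_field"] and record.get(q["known_field"]))]

    for question in questions_to_ask(survey_questions, customer_record):
        print(question["text"])
    # Prints only the NPS and open-ended questions; name, email and company
    # details are already on file, so the customer is never asked for them again.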

Get the number right

Data from SurveyMonkey highlights the drop-off rate associated with asking too many questions:  

[Figure: SurveyMonkey data on survey drop-off rates by number of questions]

Whilst it may stand to reason that longer surveys are likely to create some form of attrition, Sarah Frazier of Customer Gauge highlights that when it comes to customer satisfaction surveys, a clear correlation exists between brevity and future customer retention:

“Voice of Customer (VoC) surveys are used to continuously take the temperature of customer relationships. These types of surveys should be conducted at least every quarter to ensure that customer issues don’t accumulate and remain unsolved and to stop customers from detracting. VoC and NPS surveys should be short, as they are not trying to “measure everything at once” when it comes to customer relationships, but to get as much feedback as possible with clear answers.

“According to our research, VoC (including Net Promoter Score (NPS)) surveys improve retention the most when they have 6 questions or less, as shown in the figure below.”

[Figure: Customer Gauge research on survey length and retention]

“Designing a quality customer satisfaction survey is a process, requiring multiple edits to reach the best version,” says Martha Brooke.

“Throwing in every question is how NOT to design a survey. Think about what you want to know, and carefully craft your questions.”

Get the questions right

When it comes to question-setting, numerous studies exist on the pros and cons of fielding different types of questions. However, having adhered to some of the earlier best practices, most advice points to designing surveys that blend the right mix of qualitative and quantitative questions.

“It’s always best to start a survey with a broad question rating an experience, and then an open question in which the customer can tell their story,” says James Bolle.

“Try to keep the whole thing short – it may be touching thousands of people, so it needs to use the right language and adhere to your overall brand experience. Make sure it looks nice, is interactive and is fun to do.

“Lots of brands want to use NPS because it’s the one number you need to grow; however, there are other measures that tap into the emotional reaction to an experience much better. When we survey customers about the emotions they experience, one of the top things that comes out is that they feel satisfied. Questions about satisfaction have fallen out of fashion somewhat in recent times but they are still relevant in tapping into the emotional core of any given experience, and that’s ultimately what you want to be measuring.”

Survey experts Bsquared suggest that for online, telephone and even face-to-face B2B satisfaction surveys, the following questions often yield the best results:

  • How do you rate your overall experience with us?
  • Would you continue to use us even if you had the possibility of using an alternative provider?
  • Would you recommend us? Here you can use the Net Promoter Score (NPS); a short sketch of how the score is calculated follows this list
  • How could we improve our service to you?
  • Future plans - is there something specific?
  • What challenges will you face in the future?
  • How could we further support your business?
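
Since the recommendation question above typically feeds a Net Promoter Score, it is worth being clear about how that figure is derived: respondents scoring 9-10 count as promoters, 0-6 as detractors, and the score is the percentage of promoters minus the percentage of detractors. The minimal Python sketch below shows that standard calculation; the sample responses are invented purely for illustration.

    # Minimal sketch of the standard NPS calculation from 0-10
    # "Would you recommend us?" responses. Sample data is invented.
    def calculate_nps(scores):
        """Return the Net Promoter Score for a list of 0-10 ratings."""
        if not scores:
            raise ValueError("No survey responses supplied")
        promoters = sum(1 for s in scores if s >= 9)   # ratings of 9-10
        detractors = sum(1 for s in scores if s <= 6)  # ratings of 0-6
        # Passives (7-8) affect the score only by adding to the total count.
        return round(100 * (promoters - detractors) / len(scores))

    responses = [10, 9, 9, 8, 7, 6, 10, 5, 9, 8]
    print(calculate_nps(responses))  # 50% promoters - 20% detractors = 30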

Avoid bias

Bias is a common issue for most surveys. As Tom Wormald explains, it can often come from both the customer and the survey itself.

“Firstly there’s bias from the customer, because often an experience with a brand is a sort of nebulous collection of complex interactions that then has to be packaged up and summarised in a very neat and concise manner for the sake of a survey. You almost codify what you think has happened purely because you’re being asked to fill in a survey.

“Then there’s an organisational bias. You often hear what you’re listening for – you ask a set of questions that covers off the things you want to hear, when actually you may not have focused on the things that customers want to tell you.” 

To avoid unnecessary bias from both sides, it’s key to offer open questions that allow customers to explain their own individual experience with your brand.

Martha Brooke suggests it’s important to avoid leading questions, such as ‘How satisfied were you with the speed of our checkout?’, or multiple questions in one, as with this example from US retailer Lowe’s: ‘Were you consistently greeted and acknowledged in a genuine and friendly manner by Lowe’s associates throughout the store?’

“Imagine Lowe’s finds that 85% of customers say “No,” they were not consistently greeted/acknowledged in a genuine/friendly manner,” says Brooke.

“Obviously they need to make improvements—but what? Their greetings or their acknowledgements? How friendly they are or how genuine they are? The best survey questions provide clear and actionable insights."

Indeed, this last point is most pertinent. Whilst designing engaging surveys may feel like a lesson in stripping away anything that’s deemed irrelevant to you and your customers, the core aim should always be to glean information that’s wholly relevant – the truly actionable insight.
