
Can sinking customer survey response rates ever be revived?
Response rates have always been a challenge for the insight community, but now more than ever we need to focus on what can be done to reverse the downward trend witnessed since the late 1990s, a trend that has in some cases been exacerbated by the pandemic.
What are some of the reasons for the decline in response rates? For starters, people are deluged with surveys and feedback requests. Their time is at a premium, so day-to-day challenges take priority. Adding to this is the fact that the ROI for respondents is not always clear. To understand this point, put yourself in the customer’s shoes – if I spend time providing feedback, do I believe that anybody will listen? Will they act?
However, achieving statistically valid response volumes is vital if we want to be sure that the information received via surveys is unbiased and reflects the sentiment of the population or customer segment that we are trying to reach.
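To make the statistics concrete, here is a minimal Python sketch of the standard sample-size calculation, with a finite-population correction for small segments. The segment size, 95% confidence level and 15% response rate below are illustrative assumptions, not recommendations:

```python
import math

def required_responses(population: int, margin: float = 0.05,
                       z: float = 1.96, p: float = 0.5) -> int:
    """Completed responses needed for a given margin of error.

    n0 = z^2 * p * (1 - p) / margin^2 at ~95% confidence (z = 1.96),
    then a finite-population correction shrinks n0 for small segments.
    p = 0.5 is the most conservative (largest-sample) assumption.
    """
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# A 2,000-contact segment needs ~323 completes for +/-5% precision;
# at a 15% response rate that would mean inviting ~2,150 contacts -
# more than the whole segment, which is why response rate matters.
print(required_responses(2_000))  # 323
```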
We all know that there is no such thing as a perfect sample or a perfect response rate. Response rates vary widely, from 5% to 50%, and are driven by a variety of factors, including business focus (B2B vs. B2C), account management model (high touch vs. low touch) and programme type (relationship vs. transactional).
Generally speaking, customer survey response rates are higher than general sample Market Research rates, but lower than employee/partner/internal response rates. Heavily solicited target respondents may suffer overall survey-invitation fatigue and will be harder to hear from. A B2B customer of several years who interacts with your company several times a week may be more willing to respond than an infrequent consumer in a quick transaction.
Given that the overall objective for CX professionals is to use customer feedback to drive lasting, positive business change, what can be done to achieve response rates you can rely on to provide accurate insight?
The following practical guidelines will hopefully prove useful.
- Clean your database first: Data hygiene is essential before launching a survey. It should not be seen as a mundane, low-priority task; instead, maintaining the customer database should be regarded as a vital, ongoing discipline and the database itself as a key strategic business asset. As part of the data cleansing exercise, segment contacts by role, so that the right questions can be aligned to each area of responsibility (a minimal cleansing sketch appears after this list).
- Engage in better pre-survey and post-invitation communication: Pre-survey outreach to customers often yields a better survey response. This can be done using a variety of methods – email, website banners, sales team communication, etc. The content should express why you are doing the survey, what you will do with the data and, if you have run prior waves, what you have changed as a result of the feedback provided by customers. After you send the invitations, front-line reps should follow up with non-responders to personally encourage participation; it pains me to have to say this, but please be mindful that these appeals do not include score-begging, that is, the practice of saying something like “please give me a 10”. If necessary, provide scripts for your teams to follow to ensure a consistent, appropriate message is communicated.
- Don’t be afraid to test approaches: A/B testing on invitation content can help you home in on which messages resonate best with customers and lead to better response rates (see the significance-test sketch after this list). This is particularly important if analysis suggests that open rates on email invitations are an issue.
- Don’t just ‘spray and pray’: Launching the invitations and waiting for the surveys to roll in can be a recipe for disappointment. Rather, spend time (particularly at the start of the initiative) monitoring the delivery, open, click-through and completion rates; a funnel-monitoring sketch follows this list. Getting an early warning of potential issues (and the corrective actions needed) can prevent problems later in the process.
- Watch your language: Avoid words like ‘survey’, ‘questionnaire’ and other research-oriented language. These will often be intercepted by spam detectors. They also tend to be impersonal and do not encourage customer participation.
- Start the survey in the invitation: Nesting the first question of the survey in the body of the invitation can encourage customers to start the survey straight away. Moreover, it can help ensure that the most important question is answered first and foremost.
- Keep it short: The trend is toward targeted, short surveys. For each question in a survey, ask yourself “What do we want to learn, what will we do once we learn it, and who will own it?” If you can’t answer all these questions, reconsider including the question.
- Don’t ask questions you should already know the answer to: Avoid asking customers to provide information you should know – for example, the last time they had contact with you, products they have, etc. This not only takes more of the customer’s time, but it also sends a message that you don’t know the customer and/or don’t value their time.
- Manage how often you survey: Create business rules that govern how often customers can be surveyed (a simple quiet-period rule is sketched after this list). By actively managing the cadence of surveys to individuals, you will minimise the over-surveying burden that is increasingly common in the marketplace.
- Thank the customer for their participation: After the survey, thank the customer for their time and feedback. Include in this message a recap of what you plan to do with the information. Where possible avoid an impersonal approach that looks like a canned auto-response.
- Close the loop: Building in a process for employees to close the loop with customers will ensure that everyone concerned knows how seriously the company views customer feedback. It will provide evidence that there is an ROI on customers’ investment of their time.
- Act on the insight: The importance of this should not be underestimated. Making changes that improve customer experience will not only be good for business but will also help increase response rates over the long run.
- Engage in better post-survey communication: Without question, it’s best practice to communicate what was learned in the programme and what initiatives you have underway as a result. A good example of how to do this is to create a website where customers can see the status of various initiatives. This level of transparency can help to increase participation.
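To make a few of these points concrete, here are some small illustrative sketches in Python. First, the database-cleansing step: a minimal sketch using pandas (an assumed tool choice – any data tool will do), with hypothetical columns and contacts:

```python
import pandas as pd

# Hypothetical contact extract; column names are illustrative only.
contacts = pd.DataFrame({
    "email": ["ann@acme.com", "ann@acme.com", "bob@acme", "cara@beta.io"],
    "role":  ["IT admin", "IT admin", "Finance", "Executive sponsor"],
})

# 1. Drop exact duplicates so nobody is invited twice.
contacts = contacts.drop_duplicates(subset="email")

# 2. Flag obviously malformed addresses before they hurt delivery rates.
valid = contacts["email"].str.contains(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", regex=True)
needs_review = contacts[~valid]        # route these to manual review

# 3. Segment by role so each contact gets questions aligned to
#    their area of responsibility.
for role, group in contacts[valid].groupby("role"):
    print(role, len(group), "contacts")
```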
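For the A/B testing point, one simple way to judge whether one invitation genuinely outperformed another is a two-proportion z-test on the response rates. A sketch using only the standard library; the counts are invented for illustration:

```python
from math import sqrt
from statistics import NormalDist

def ab_response_test(sent_a: int, responded_a: int,
                     sent_b: int, responded_b: int) -> float:
    """Two-proportion z-test on response rates for invitations A vs. B.

    Returns the two-sided p-value; a small value suggests the
    difference in response rates is unlikely to be chance.
    """
    p_a, p_b = responded_a / sent_a, responded_b / sent_b
    pooled = (responded_a + responded_b) / (sent_a + sent_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = (p_a - p_b) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Invitation A: 1,000 sent, 180 responses; B: 1,000 sent, 140 responses.
print(ab_response_test(1_000, 180, 1_000, 140))  # ~0.015 -> A likely better
```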
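For funnel monitoring, the pipeline reduces to a handful of stage-conversion rates, each pointing at a different corrective action (list hygiene, subject line, invitation copy, survey design). A sketch with invented counts and an assumed 30% alert threshold:

```python
# Illustrative funnel counts; in practice these come from your email
# platform and survey tool.
funnel = {"invited": 5_000, "delivered": 4_650, "opened": 1_490,
          "clicked": 430, "completed": 310}

ALERT = 0.30  # assumed example benchmark, not an industry standard

stages = list(funnel)
for prev, curr in zip(stages, stages[1:]):
    rate = funnel[curr] / funnel[prev]
    flag = "  <-- investigate" if rate < ALERT else ""
    print(f"{prev} -> {curr}: {rate:.1%}{flag}")

# delivered/invited = 93.0% (list hygiene), opened/delivered = 32.0%
# (subject line), clicked/opened = 28.9% (invitation copy, flagged),
# completed/clicked = 72.1% (survey length/design).
```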
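Finally, for survey-frequency governance, the core of a business rule can be as simple as a quiet period per contact. A sketch assuming a hypothetical 90-day rule and an in-memory contact history:

```python
from datetime import date, timedelta

# Assumed business rule for illustration: at most one survey
# invitation per contact every 90 days.
QUIET_PERIOD = timedelta(days=90)

last_surveyed = {                      # hypothetical contact history
    "ann@acme.com": date(2024, 1, 10),
    "bob@beta.io":  date(2023, 9, 2),
}

def eligible(email: str, today: date) -> bool:
    """True if the contact is outside the quiet period (or never surveyed)."""
    last = last_surveyed.get(email)
    return last is None or today - last >= QUIET_PERIOD

invitees = [e for e in ["ann@acme.com", "bob@beta.io", "new@gamma.co"]
            if eligible(e, date(2024, 2, 1))]
print(invitees)   # ['bob@beta.io', 'new@gamma.co'] - ann is suppressed
```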
Above all, remember it's a marathon, not a sprint! Managing response rates is a job that is never complete so consider building an ongoing examination and diagnosis of response rates into the Voice of the Customer (VoC) workflow. Doing so will not only help you to maximise response but will also provide insight into other areas of the VoC programme and CX processes that need attention.
Replies (6)
This comment was posted in the MyCustomer LinkedIn group by member Sven Esser:
Customer feedback is very important. The question is whether customer surveys are still the way to go. Every one of us is flooded with information, and by nature this leads us to reject surveys. However, I think today we have many other possibilities to get the information we need from a customer. I think surveys are just a small / nice add-on and not the thing to focus on.
This comment was posted in the MyCustomer LinkedIn group by member Peter Dorrington:
Mark gives some good advice about survey design, but there remain some fundamental challenges: many customers are over-exposed to satisfaction survey questions about the vendor (how well did we do?) vs. the customer (how well did you do?); surveys are often too long or too granular, overworking the few respondents they do get in the pursuit of an actionable insight; or they are too difficult to give a meaningful response to (e.g. needing an essay as an answer). They are also often asked at the wrong time - how likely are you to recommend us (when we've not even dispatched your order yet)?
For me, too many customer surveys are conducted in order to standardise/categorise answers and come up with a score that reflects industrial-era management measures of success rather than service-era leadership.
Hi, Peter - I appreciate your comments. Some of what you describe are good examples of bad design - for example, getting the survey before the order is complete. When I see these types of examples, I often wonder if there is an objective for the data beyond getting a number.
The best practice, of course, would be to start with a specific business question or objective we are trying to address and to make sure that any questions asked are designed to prompt some kind of action. If we stay true to that, we will be more likely to streamline the number of questions we ask (and, I find, the questions will often be much more relevant to the customer and/or customer-focused).
Thanks, Sven, for your comments. Like you, I believe that surveys can be one component, but they are not the only component. In fact, the ability to pull in other data (financial and operational, as well as data such as chat and online reviews) has enabled us to be much more focused on the kinds (and amount) of questions we ask as well as the analytics that can be conducted on the data. Regardless, I do feel that surveys continue to have an important role to play.
Unfortunately, many organizations continue to work in information silos; this means the well-intentioned desire to gather customer feedback often results in customers getting numerous requests for surveys from different parts of the same organization. Many of these surveys have the same (or similar) questions and may not align with the brand image the organization wants to maintain. These send a signal to the customer that there is little likelihood that the data will be acted upon, which results in the customer opting out. One answer to that is to use a consolidated platform/approach with a sound governance policy to ensure that customers are not over-surveyed. That's a bit of a different topic, but it is related to (and can influence) response rates.
User feedback is very important. Thank you, Mark, for sharing the suggestions with us.
Thanks, Calceus - glad you liked it.