Four ways to design customer surveys for action, not just insight
Four tips for building surveys whose results reflect what customers truly think and give your team the insights they need to act.
You have questions that need answers. Getting those answers from customers usually requires creating a survey to gather customer feedback.
The good news is that surveys seem pretty straightforward, right? Ask a few questions, get the answers you need, and then you’re off to the races making changes and finding new insights. Unfortunately, getting great actionable data is harder than it looks. In a 2018 report, Dun & Bradstreet and Forrester Research found that only 43% of individuals polled believe their companies’ “data sources and insights are well integrated, understood, consistent, validated, and shared across the organisation.” Furthermore, only 42% of those surveyed think their company is capable of converting the gathered data into insights.
Oftentimes this gap between data and insight stems from an issue right at the beginning of the process: the survey design itself. If your survey isn’t properly designed, you might end up with very few responses - or worse, misleading data that negatively influences your business decisions. Very few of us working in customer experience have a statistics degree, so it’s understandable that we’re mostly winging it when it comes to creating statistically viable surveys. That’s why we’re here.
In this article, we cover four tips for building better surveys: ones whose results represent what customers truly think and give your team the insights they need to act. Follow these tips and your team will be making top-notch surveys in no time.
1. Understand what you are trying to measure
The business objectives you are trying to achieve will dictate the type of survey you create. When the desired outcome is unclear, surveys become unfocused. Even though the survey may generate a lot of data, it won’t be helpful in making data-driven business decisions.
Before creating a new survey, decide what business problem you are trying to solve. Are you looking to validate your product roadmap? Streamline the customer onboarding journey? Improve the customer service experience? All of these surveys will look very different.
Once you’ve decided your overall survey objective, determine the top-line metric you’d like to base your survey on. This metric will be the structured backbone of your survey. Common top-line metrics include customer satisfaction (CSAT), Net Promoter Score (NPS) and Customer Effort Score (CES). This anchor metric will help position any qualitative feedback you receive through text-based responses. Beware though, because as McKinsey reports, “no one metric is the best for all businesses... and best-in-class operators generally choose the metric that is most predictive of their desired business outcome, which can vary by industry.”
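Of these anchor metrics, NPS has the most mechanical definition, so it makes a good concrete example: the score is the percentage of promoters (ratings of 9–10) minus the percentage of detractors (0–6). A minimal sketch of that calculation:

```python
from typing import Iterable

def nps(scores: Iterable[int]) -> float:
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6).

    Passives (7-8) count toward the total but neither add nor subtract.
    """
    scores = list(scores)
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

# 10 responses: 4 promoters, 3 passives, 3 detractors -> NPS of 10.0
print(nps([10, 9, 9, 10, 8, 7, 8, 3, 6, 0]))
```

Note that NPS ranges from -100 to +100, which is one reason it can’t be compared directly with percentage-based metrics like CSAT.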
Next, determine the touchpoints that are appropriate for the measurement you’re trying to make. Asking a visitor who is just browsing your website how they feel about your company won’t help you decrease churn - they might not even have a real opinion yet. By contrast, running an NPS survey with customers who’ve just completed their onboarding period is the perfect place to get feedback about the onboarding journey.
2. Use both structured and unstructured questions
We all know the old adage, “it’s not what you say, it’s how you say it.” This is especially true of survey questions: how you ask is almost as important as what you ask. Getting someone to open your survey, let alone fill it out, is a challenge of its own. So, the last thing you want to do is frustrate or confuse them with overly complicated questions.
Different types of survey questions are useful for different types of data. There are two main categories of questions: structured and unstructured.
Structured questions are close-ended (i.e. respondents choose from a predetermined set of answers), which means they require less cognitive load. Because they are easier to answer, structured questions can gather a lot of valuable information that is also easy to analyse. They result in more accurate data, so the majority of your survey questions should be close-ended. Examples of structured questions include the NPS question, demographic data collection and satisfaction ratings.
Unstructured questions are open-ended and allow for respondents to provide their opinion in free-flowing text. Open-ended questions require more time to read, analyse, and consider. However, unstructured data can be extremely powerful when analysed through text analytics software that takes advantage of natural language processing.
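Production text analytics relies on dedicated NLP software, but the core idea of surfacing recurring themes can be illustrated with a minimal keyword-frequency sketch (the stopword list and sample feedback below are illustrative assumptions):

```python
import re
from collections import Counter

# Tiny illustrative stopword list; real NLP tooling ships far larger ones.
STOPWORDS = {"the", "a", "an", "and", "or", "to", "of", "is", "it", "was", "i", "my"}

def top_terms(responses, n=5):
    """Return the n most frequent non-stopword terms across open-ended answers."""
    words = []
    for text in responses:
        words += [w for w in re.findall(r"[a-z']+", text.lower()) if w not in STOPWORDS]
    return Counter(words).most_common(n)

# Hypothetical open-ended survey responses:
feedback = [
    "Onboarding was confusing and slow",
    "Support team was great, onboarding slow",
    "Loved the support team",
]
print(top_terms(feedback, 3))
```

Even this crude count surfaces “onboarding” and “slow” as recurring themes - the kind of unexpected signal close-ended questions would never capture.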
When your survey is designed properly, you’ll gather both clear, accurate data from your close-ended questions alongside rich, potentially unexpected, feedback in the form of unstructured data. Together, they drive action.
3. Let the metric dictate the method
Depending on the metric you’ve decided to capture, the touchpoint and medium you use will change. The touchpoint is when you choose to send the survey, perhaps aligned with actions that the customer has taken, and the medium is the format or the channel you use. Together, they can make a big impact on the amount and value of data you receive.
- The touchpoint: depending on the business outcomes you identified in part one, the timing of your survey will vary. Each point of the customer journey will offer different insights and opportunities.
- The medium: whether it’s in-app, email, SMS, direct call or even good ol’ fashioned pen and paper, how you deliver the survey needs to align with the questions you’re asking and the point at which you’re asking them.
Let’s put the two of these together in some common survey scenarios:
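As a rough illustration (these pairings are hypothetical examples, not prescriptions - the right combination depends on your business outcome), a few common scenarios might map out like this:

```python
# Hypothetical scenario pairings: each maps a business outcome
# to a touchpoint, a top-line metric and a delivery medium.
SURVEY_SCENARIOS = {
    "onboarding feedback": {
        "touchpoint": "end of the onboarding period",
        "metric": "NPS",
        "medium": "email",
    },
    "support quality": {
        "touchpoint": "immediately after a support ticket closes",
        "metric": "CSAT",
        "medium": "in-app",
    },
    "ease of self-service": {
        "touchpoint": "after a visit to the help centre",
        "metric": "CES",
        "medium": "in-app",
    },
}

for name, s in SURVEY_SCENARIOS.items():
    print(f"{name}: {s['metric']} at {s['touchpoint']} via {s['medium']}")
```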
Once you’ve cemented your desired business outcome, chosen a top-line metric and identified the appropriate touchpoints for your surveys, it’s time to ensure your survey is well designed. Even if the theory behind your survey is sound, a cumbersome design will deter customers from giving their feedback and limit your results.
4. Design the survey with the user experience in mind
- Keep it short and sweet - if your survey takes more than five minutes to complete, or exceeds 12 questions, your response rate drops by an average of 17%. A survey that takes more than 10 minutes results in a 40% lower response rate on average.
- Limit options to the essentials - it’s tempting to cover all bases when you write a survey question. For example, you may want to know what type of pet someone owns. You could list every species of animal a person could have but, in the end, that isn’t practical because most people’s pets fall into just a few categories. If you want more flexibility, offer a type-in “other” option for the outliers.
- Ask one question at a time - avoid double-barrelled questions, which ask two things at once but pose them as one. An example is, “how far would you drive for dinner and a movie?” That may seem like a fine question, but the respondent may drive different distances for those two things, so ask about each separately to get more accurate data.
- Avoid leading questions - bias is a reality of surveys. Have another pair of eyes look over the survey and pay specific attention to how questions are framed. Are they neutral, or do they lead the respondent towards a certain answer? For example, the question “How awesome a job did Jon do?” may push the customer towards a more positive answer, since the wording already suggests Jon did an awesome job. It may make your results look better, but they’ll be inaccurate. A better version is, “Were you satisfied with the service you received?”
You want to be thorough when you create your survey. Making sure you get all the information you want is important, but it’s also important to make sure you don’t needlessly ask questions. Seeing multiple versions of the same question can be fatiguing, or annoying, to your respondents, so do your best to avoid it.
Here are two things to ask yourself when adding questions:
- What will we do with this information? If you don’t have an immediate answer for how you will action the data, then it’s not a necessary question.
- Can we get this information another way? For example, if you already have the user’s email address, can you pull up their other account details or previous survey results to fill in the same answer?
We all want high-quality data. Customer insights guide company decisions and help us move forward. Surveys are a great way to gain those insights, so it’s important that we take care when creating them. Be sure you’re asking the right questions and considering your respondent at every point. If you do, your survey will be a success.