If the response rates to your B2B customer surveys have dropped over the past couple of years, you aren’t alone. The trend spans verticals, product types, decision makers and end-users. Multiple factors contribute to the decline: spam filters have gotten stricter; DIY survey tools and near-ubiquitous NPS programs overload customers with surveys; and poorly designed surveys create a poor experience for customers.
So what can you do?
While the overall drop in response rates is largely out of your control, you can take steps to slow the decline in your own customer surveys.
Manage the Number of Surveys
Limit the number of survey requests to individual customers to 2-4 per year. This requires controls over survey distribution similar to those used for marketing emails. If different departments have access to DIY survey tools, a single customer can receive separate surveys from customer service, marketing, product management, etc.
Give Customers a Reason to Respond
Customers need a reason to select your survey over the competing invitations in their inbox – they aren’t going to give their time to everyone who wants their feedback. Third-party research firms provide financial incentives to encourage participation. What can you provide? Can you offer a summary of the findings, a drawing for a new iPad, a discount, a Starbucks gift card? If you ask for a customer’s time, offer something in return.
At a higher level, customers are more likely to respond when they see their feedback makes a difference. If you routinely ask customers to take surveys, provide customers with updates on the actions you’ve taken based on their feedback. Have you changed a policy? Introduced a new feature or a new product?
Manage Survey Length
B2B surveys will always be longer than consumer surveys; the 10-question mobile surveys used in consumer markets have limited practical value in B2B. Still, it is important to keep B2B surveys as short as possible.
- Triage your questions into need-to-have vs. nice-to-have and only ask the need-to-have questions.
- Don’t ask customers to provide descriptive data (sector, size, etc.) that already exists in your internal system. Append the data from your internal systems to the survey data.
- Be honest about the time it will take a customer to complete the survey. A good rule of thumb is that four closed-ended questions take about one minute, so a typical 40-question survey takes 10 minutes. And remember that grid questions count as individual questions: if you ask customers to rate the importance of 10 attributes, count each attribute as its own question.
- We recommend keeping your surveys to less than 12 minutes.
- Include a progress bar so customers know where they are in the survey.
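The rule of thumb above (four closed-ended questions per minute, with each grid row counted separately) can be sketched as a quick estimate. The function and its parameters are illustrative assumptions, not a standard formula:

```python
def estimated_minutes(closed_questions: int, grid_rows: int = 0,
                      seconds_per_question: float = 15.0) -> float:
    """Estimate completion time: each closed-ended question (and each
    grid row) takes ~15 seconds, i.e. four questions per minute."""
    total_questions = closed_questions + grid_rows
    return total_questions * seconds_per_question / 60.0

# A 30-question survey plus a 10-attribute importance grid:
print(estimated_minutes(30, grid_rows=10))  # → 10.0
```

If the estimate lands above your target (say, 12 minutes), that is the signal to cut nice-to-have questions before fielding.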
Limit Open-Ended Questions
Only ask customers to complete open-ended questions if the data are absolutely necessary. Open-ended questions require effort to answer, and most customers provide short, generic responses that don’t add to what’s already covered in the closed-ended questions. This adds work for the customer without adding meaningful value to the analysis. If you do use open-ends, make them as specific as you can – “What can we do better?” is unlikely to yield many insights.
Copy Edit your Survey
As with books, articles and blogs, surveys that are well written are easier for customers to read and feel shorter to take. Pick up the classic handbook on writing, The Elements of Style by Strunk & White, and follow its basic rules: use simple, concise language, use the active voice, eliminate needless words, etc.
Make sure that your answer categories are clear and match the questions being asked. As team members make changes to survey questions, it’s easy to overlook the corresponding changes required in the answer categories.
A clean, clear survey not only makes the experience better for the respondent—which makes them more likely to take another survey from you—but provides better data by reducing ambiguity.
Time Your Invitations and Reminders
While there’s no perfect day to send out invitations, Tuesday through Thursday mornings work a little better for B2B surveys.
Send a reminder one week after the initial invitation to customers who haven’t responded. Make sure you don’t send a reminder to customers who have already completed the survey.
A second email reminder two weeks after the initial invitation is OK, but we caution against any additional reminders. The additional gain in responses is small and doesn’t justify adding another message to the customer’s inbox.
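If you manage the reminder schedule yourself rather than through a survey platform, the filtering logic can be sketched as follows. The data structures and field names here are illustrative assumptions:

```python
from datetime import date, timedelta

def reminder_recipients(invitations, completions, today, weeks_since_invite):
    """Return the emails due a reminder: customers invited at least
    `weeks_since_invite` weeks ago who have not completed the survey."""
    cutoff = today - timedelta(weeks=weeks_since_invite)
    completed = set(completions)
    return [email for email, sent_on in invitations.items()
            if sent_on <= cutoff and email not in completed]

# Two invitations sent March 1; one customer has already completed:
invites = {"a@example.com": date(2024, 3, 1),
           "b@example.com": date(2024, 3, 1)}
done = ["b@example.com"]
print(reminder_recipients(invites, done, date(2024, 3, 8), 1))
# → ['a@example.com']
```

The same function covers the second reminder by passing `weeks_since_invite=2` with an updated completions list.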
Check Your Data Before Sharing
Once you have the data, compare the respondents to your customer database to see whether any customer groups are over- or under-represented. If so, weight the data before sharing the results with a broader audience.
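One straightforward way to weight the data is post-stratification: give each segment a weight equal to its share of the customer base divided by its share of respondents. A minimal sketch, with illustrative segment names and counts:

```python
def segment_weights(population_counts, respondent_counts):
    """Post-stratification weight per segment:
    (segment's share of customer base) / (segment's share of respondents)."""
    pop_total = sum(population_counts.values())
    resp_total = sum(respondent_counts.values())
    return {seg: (population_counts[seg] * resp_total) /
                 (pop_total * respondent_counts[seg])
            for seg in population_counts}

# Enterprise customers are 40% of the base but only 20% of respondents,
# so their responses are weighted up and SMB responses weighted down:
w = segment_weights({"enterprise": 400, "smb": 600},
                    {"enterprise": 20, "smb": 80})
print(w)  # → {'enterprise': 2.0, 'smb': 0.75}
```

Multiplying each response by its segment weight restores the segment mix of the full customer base before computing averages.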
Look for any biases in the data. For example, are the responses to your surveys coming from the same group of customers survey after survey? If so, you may want to consider holding them out of future surveys.
Regardless of what you do, remember that all data sets require a degree of judgment in the interpretation and use of the results. Survey data is just one data point in any decision.
In the end…
In the end, there are no silver-bullet solutions for increasing response rates to customer surveys or for gathering the best data possible. But there are many incremental actions you can take to get the most out of the responses you do receive.
These actions also acknowledge that a customer survey is a brand touch point. The entire survey experience (the number of invitations, the content of the invitation and the survey, how you use the data) shapes perceptions of your product and brand.