EXPERT INSIGHTS
Jun-07-2021
Amelia Carry and Jackson Kushner
Surveys are everywhere these days, and survey fatigue is a real, documented issue that brands have to counteract without alienating their customers. It’s a tricky balance to strike — and doing it without data would make it even trickier.
So we did some research. We ran our own survey (ironic, we know) with over a thousand American consumers to determine two things:
1. How do they really feel about surveys, what sorts of surveys do they prefer, and do they respond truthfully when brands solicit feedback?
2. How many times can we write the word “survey” before it starts to sound weird?
The answer to the second question, as it happens, is four — which is unfortunate, given how much of this blog is still to come. But let’s press on anyway.
We can start with a simple question: how did we get here?
Let’s start by saying that the point of this blog is not to undermine the importance of surveys. They’re extremely useful for businesses. The trend of brands soliciting customer feedback wouldn’t have started, let alone become so popular, if it didn’t help them improve their Customer Experience (CX) and drive better business outcomes. That’s why surveys are now so ubiquitous: 86% of consumers say they’ve received at least one request for survey feedback from a brand in the past year. So, why are surveys important? Simply put, because brands use accurate survey feedback to identify customer pain points and solve problems more quickly and efficiently.
Surveys won’t be going anywhere soon. Indeed, customers understand this simple fact, and even recognize that it can benefit them: according to our research, a whopping 64% of consumers appreciate when a brand seeks out their feedback or opinion in a survey.
Unsurprisingly, consumers are also more likely to respond to a survey after they’ve just concluded an interaction with a brand, when the interaction is fresh in their minds. Numbers drop significantly when customers are asked to rate their overall satisfaction or relationship with a brand:
Given the utility of this information, and given that consumers still rate themselves as likely to respond, surveys aren’t going to become a thing of the past anytime soon.
So, if brands love giving surveys and consumers don’t mind responding to them, shouldn’t the blog end right here?
Not quite. While surveys are an important part of a customer engagement strategy, they don’t tell the whole story. Certain factors can affect survey accuracy, which means brands should also look elsewhere for their customer engagement data. Let’s cover the three biggest hurdles to running accurate surveys.
We’ll start with the most obvious issue: response rates to survey prompts are quite low. Among consumers in our own survey, only 7% claimed to respond to every survey they receive; 22% said they respond to most, and 34% said they respond to some. But even these numbers are probably exaggerated. Customer Thermometer estimates average response rates at somewhere between 5% and 30%. If you’re only getting feedback from roughly a third of your customers (on a good day), you’re missing out on the other two thirds of potential data.
To make matters worse, there’s no reason to think the third that responds is representative of the whole population; in fact, there are good reasons to think the opposite. Unsurprisingly, our results showed that customers are far more willing to respond to surveys when they’ve had a particularly good or particularly bad experience. This means that results will often be skewed. For instance, a huge number of happy customers responding to the survey may outweigh the customers who thought their service was just okay but chose not to give that feedback. Unfortunately, that feedback would have been far more valuable to the brand than the feedback they actually received.
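To make the mechanics of that skew concrete, here is a minimal simulation sketch in Python. The satisfaction distribution and response probabilities below are invented purely for illustration; the point is simply that when customers with extreme experiences answer more often, the average computed from respondents drifts away from the true average, and the “just okay” middle goes underrepresented.

```python
import random

random.seed(42)

# Hypothetical customer base: true satisfaction scores on a 1-5 scale.
# The distribution is made up for this illustration.
customers = random.choices([1, 2, 3, 4, 5], weights=[5, 10, 40, 30, 15], k=10_000)

# Assumed response probabilities: customers with very bad or very good
# experiences are far more likely to answer a survey than the middle.
response_prob = {1: 0.40, 2: 0.15, 3: 0.05, 4: 0.15, 5: 0.40}

respondents = [score for score in customers if random.random() < response_prob[score]]

true_avg = sum(customers) / len(customers)
survey_avg = sum(respondents) / len(respondents)

print(f"True average satisfaction:  {true_avg:.2f}")
print(f"Average among respondents:  {survey_avg:.2f}")
print(f"Response rate:              {len(respondents) / len(customers):.0%}")
```

Under these made-up numbers, the respondent average lands noticeably higher than the true average, which is exactly the kind of gap a brand never sees if it looks only at the survey results themselves.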
Unfortunately, you can’t combat the issue of low response rates simply by issuing more surveys and hoping to get a greater number of responses. Survey response rates drop precipitously when customers feel they’re being asked for feedback over and over — that is, when they’re feeling “survey fatigue.” 41% of consumers say they’re not likely to respond to a survey if they’ve already given feedback to the brand’s employees.
Brands can take a number of measures to drive response rates up. One of the most popular is to offer small compensation for completing the survey — perhaps a gift card, a discount, or an entry into a sweepstakes. And, as expected, this is a reliable way to increase response rates.
Unfortunately, this can also affect survey accuracy. 23% of our own survey respondents admit that they sometimes take feedback surveys just to get the prize, but don’t really provide honest feedback. This means brands need to take additional measures to check the quality of their survey responses: eliminating straightliners, speeders, and low-quality open-ended commenters. They should also keep this in mind when interpreting the data, as the margin of error is probably wider than it appears on the surface.
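As a rough illustration of what those data-quality checks might look like in practice, here is a short Python sketch. The field names and thresholds are hypothetical, and real values would be tuned to the survey’s length and design.

```python
from dataclasses import dataclass

@dataclass
class SurveyResponse:
    ratings: list[int]        # answers to the rating questions (e.g. a 1-5 scale)
    duration_seconds: float   # total time spent completing the survey
    comment: str              # open-ended comment field

# Illustrative thresholds -- not industry standards.
MIN_DURATION_SECONDS = 30
MIN_COMMENT_WORDS = 3

def quality_flags(resp: SurveyResponse) -> list[str]:
    """Return the quality issues detected for a single response."""
    flags = []
    # Straightliner: the same rating chosen for every question.
    if len(resp.ratings) > 1 and len(set(resp.ratings)) == 1:
        flags.append("straightliner")
    # Speeder: finished implausibly fast for the number of questions asked.
    if resp.duration_seconds < MIN_DURATION_SECONDS:
        flags.append("speeder")
    # Low-quality open-ended comment: empty or near-empty text.
    if len(resp.comment.split()) < MIN_COMMENT_WORDS:
        flags.append("low_quality_comment")
    return flags

responses = [
    SurveyResponse([5, 5, 5, 5], 12.0, "good"),
    SurveyResponse([4, 3, 5, 4], 95.0, "Checkout was easy but shipping took too long."),
]
clean = [r for r in responses if not quality_flags(r)]
print(f"Kept {len(clean)} of {len(responses)} responses")
```

Flagged responses don’t necessarily have to be discarded, but separating them out makes it easier to see how much of the apparent signal comes from low-effort answers.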
Another way results sometimes get skewed is by employees. 35% of our survey respondents say they have been coached by an employee to give a positive rating on a feedback survey. For instance, an employee might advise a customer to simply “press five” on their keypad when prompted, significantly increasing the chances of a “five out of five” response from that customer. Unfortunately, there is very little brands can do to prevent this sort of behavior, especially when they tie employee incentives to survey results.
Likewise, 37% of customers are hesitant to provide valuable negative feedback if they think it might impact an employee’s job; this can contribute to overweighting positive feedback.
A third issue that brands have to handle when they solicit survey feedback is that the data is rarely complete — and not just because most people don’t respond. Rather, the data is incomplete because surveys typically ask only about a single interaction that a customer has had with a brand. Take a look:
As you can see, only 35% of respondents say they’ve been asked in a survey to rate their overall experience or relationship with a brand, unrelated to a specific transaction. This is for good reason. Customers are far more likely to respond to very short, very simple surveys that get sent only seconds or minutes after an interaction. After that window has passed, they forget what happened, forget to respond, or simply lose interest. Customers are also 12–16% more likely to respond to surveys about specific interactions than surveys about overall CX, regardless of when the brand sends the survey. So it’s no wonder that surveys are almost always tied to unique interactions.
However, customer experience comprises so much more than just individual interactions, and brands can’t afford to ignore that fact. An enormous majority (76%) of consumers think a brand should care about their overall experience, not just their feedback on a single interaction or purchase. Since surveys are unlikely to provide reliable data of this sort, brands are forced to look elsewhere for context.
Again, none of this is to diminish the importance of accurate surveys (they’re very important) or to say that brands should stop using them (they shouldn’t). Surveys are important because they give customers a chance to voice concerns and brands a chance to respond to those concerns with real changes to their CX. But it’s also important to analyze survey results with context.
We can start by attempting to understand exactly why consumers respond to surveys (aside from getting a prize at the end). This chart might help:
As you can see, the top several reasons to respond to a survey have to do with providing positive or negative feedback. No surprise there — as we said above, it’s when customers feel most passionate about your brand in one direction or the other that they’re most likely to respond.
But we can get another, perhaps even more important insight from the bottom of this chart. Likelihood of response drops significantly when customers never plan to interact with a brand again. This reveals a crucial detail about customer motivation:
Customers want to give your brand feedback — and they do, all the time. But it’s usually not in survey form. In fact, 37% of respondents say they’ve ignored a survey request because they had already given feedback somewhere else. And this unsolicited feedback is often more likely to reflect real opportunities for improvement than survey data, because it doesn’t suffer from the issues described above.
Where does all this unsolicited feedback come from? The answer is that customers are providing you with data every time they interact with your brand, whether it’s a direct message on social media, an online review, a purchase, a post in your online community, or even a phone call. We asked where else customers provided feedback to brands, and they gave us a long list:
By analyzing these interactions, your brand can gather information about the entire customer experience without ever having to request a survey response. In fact, 47% of consumers say their interactions with a brand’s employees are more meaningful than any information they can provide in a survey — and this is exactly why it’s important to take stock of both solicited and unsolicited feedback. Although this strategy cannot replace surveys, it can provide them with vital context and make it easier for brands to improve every customer interaction across every channel.