Economics professor and “Freakonomics” co-author Steven Levitt knows a thing or two about collecting data, but he can’t always convince others of the value of gathering new information. For example, advertising executives at a well-known retail store once refused to take his advice because following it would mean admitting they didn’t know a vitally important fact: do newspaper ads increase sales?
Levitt’s advice was simple. He suggested that the retailer stop running newspaper ads in 20 test markets for a short time. After the experiment, the ad execs could simply compare the sales figures from the test markets and the control group markets to see if newspaper inserts actually affect sales.
You might think the executives would jump at this idea. It would allow them to save money on newspaper inserts for a short time and give them valuable knowledge about how effective that advertising method is.
But they didn’t go for it. The executives felt that agreeing to the experiment would signal to their bosses that they didn’t actually know how effective newspaper ads were, leaving them vulnerable to being fired.
It’s hard to fault these executives for wanting to protect their jobs, but the story does highlight the value of letting people say “I don’t know.” If the executives had felt secure enough to try Levitt’s experiment, their company might have adjusted its advertising strategy, saved money, and increased profits.
When you conduct a customer satisfaction survey, you want your respondents to feel similarly empowered. You want them to give honest answers, including being able to say “I don’t know.” Otherwise, the data you collect can be inaccurate and unhelpful. Let’s discover the power of “I don’t know.”
The Problems With the “I Don’t Know” Option
Not all survey scientists promote giving an “I don’t know” option on multiple choice questions. Essentially, they argue that anytime someone chooses that response, the survey question has been wasted and no useful data is collected. After all, conducting surveys takes time, effort, and money, so most people want to ensure they gather as much data as possible.
Another argument against “I don’t know” is that this choice lets survey participants respond passively. They can just rush through the survey instead of thinking about the questions it poses and forming opinions about those questions.
Similarly, many survey makers discourage this option because you can’t ensure that only people who truly don’t know select that answer. Consequently, some respondents with valuable opinions may keep those opinions to themselves and select “I don’t know” instead.
The Pros of “I Don’t Know”
Despite all the possible cons of letting respondents say “I don’t know,” that survey answer has value when used correctly. One advantage of “I don’t know” is that it doesn’t force people to select an answer that isn’t true for them.
Omitting an “I don’t know” option creates a dilemma similar to the one you faced on multiple-choice exams in school: when you encountered a question you couldn’t answer, you made your best guess. Survey-takers do the same. They choose the answer they think you want to hear instead of the answer that reflects their true feelings.
Secondly, “I don’t know” responses aren’t actually worthless. If a significant percentage of people choose that answer, that result indicates that something is amiss. For example, if a survey question asked “What qualities do you most associate with our brand?” and 15% of respondents said “I don’t know,” you’d know you need to create a clearer branding message to attract and retain customers.
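If you tally responses programmatically, spotting questions with an unusually high “I don’t know” share is straightforward. The sketch below is illustrative only: the question name, answer strings, and 10% threshold are assumptions, not values from any real survey.

```python
from collections import Counter

# Hypothetical raw responses: question id -> list of answers.
# Names, answers, and the threshold are made up for illustration.
responses = {
    "brand_qualities": ["Reliable", "I don't know", "Affordable", "I don't know",
                        "Reliable", "Innovative", "I don't know", "Reliable",
                        "Affordable", "Reliable"],
}

THRESHOLD = 0.10  # flag questions where more than 10% answered "I don't know"

for question, answers in responses.items():
    counts = Counter(answers)
    share = counts["I don't know"] / len(answers)
    if share > THRESHOLD:
        print(f"{question}: {share:.0%} answered 'I don't know' - "
              "consider clarifying the question or your messaging.")
```

A flagged question isn’t a wasted question; as in the branding example above, it tells you where customers lack the information or experience you assumed they had.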
How to Use “I Don’t Know” Effectively
Ultimately, including “I don’t know” options in your survey makes sense-if you do so wisely. Follow these guidelines to yield meaningful results from surveys that include this answer choice.
For starters, you don’t need this option on every question. You can expect customers to know the answers to questions about their own identity and demographics (e.g., gender, household income, age), so you can omit an “I don’t know” choice there.
Second, remember that offering “I don’t know” as a choice works best when you survey a larger audience. With a large sample, the share of respondents choosing it is a meaningful signal; in a small survey, a handful of “I don’t know” answers can dominate the results, so they are harder to interpret.
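One way to see why sample size matters is the margin of error on a proportion. This sketch uses the standard normal approximation with made-up numbers (a 15% “I don’t know” share at two sample sizes); it is an illustration of the statistical point, not part of any Opinionmeter methodology.

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a proportion (normal approximation)."""
    return z * math.sqrt(p * (1 - p) / n)

# Illustrative: the same 15% "I don't know" share at two sample sizes.
print(f"n=1000: ±{margin_of_error(0.15, 1000):.1%}")  # narrow interval: a trustworthy signal
print(f"n=20:   ±{margin_of_error(0.15, 20):.1%}")    # wide interval: hard to act on
```

With 1,000 respondents the true share is pinned down to within a couple of percentage points; with 20 respondents the uncertainty swamps the measurement, which is why “I don’t know” counts are less valuable in small surveys.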
Third, you can phrase “I don’t know” in many different ways so that this neutral response fits the question. Try these variations:
- Not sure
- Not applicable
- Other, with a box where respondents can fill in their own answer
Careful wording in your survey matters a lot, so phrase these neutral options with care. (We wrote more about effective survey wording in a previous post.)
Finally, include a few open-ended follow-up questions for people who choose “I don’t know.” Respondents may explain, “I don’t have experience with that service” or “I felt torn between two of the options.” These follow-up responses will help you evaluate what the “I don’t know” answers actually mean.
So, the next time you send out a survey, use the tactics above to place “I don’t know” options in appropriate questions. Read our other blog posts for more advice about crafting a survey that yields usable results.
To learn more about Opinionmeter’s Enterprise Survey Solution, please contact Opinionmeter at 888.676.3837 or visit www.opinionmeter.com. And please share this with any of your colleagues who might find it of interest. Thank you.