The Problem With Polls Isn’t the Lies Respondents May Tell
Originally appeared in Entrepreneur online magazine, June 14, 2016.
On June 23, the people of Britain will decide whether the United Kingdom should withdraw from the European Union. It’s a question that hasn’t been put to a public vote since 1975, when voters weighed a similar question.
How will they decide this time? If you were to consult online polling data, you’d think the referendum was a dead heat. Back in late May, prominent U.K. pollster ICM showed that voters were evenly divided, with the percentages of those wanting “out” and those wanting “in” practically tied, at 45 percent.
YouGov also showed the yeas and nays practically tied, locked in a tug-of-war over just 1 percentage point.
But phone polls and statements from pundits reveal a far different forecast. Judging by those sources, the likelihood of Britain’s leaving the Union is mighty slim, with “remain” commanding a double-digit lead in multiple phone polls from the past month.
That discrepancy might lead some people to toss out polling data altogether. If that U.K. data can’t provide a true picture of public sentiment, why even bother gathering it?
But the actual situation is quite different: When it comes to polling, the important thing isn’t the data itself but how you capture and interpret it.
Surveys: startup marvel or mirage?
Consider the startup space, for example. Startups crash for all sorts of reasons, but an alarming percentage of failures can be traced to founders who didn’t do their research. Smart startups, on the other hand, conduct research that informs product development, heads off product-market fit problems and measures brand sentiment.
Such survey-based market research isn’t common among geopolitical organizations. But what startups do have in common with those organizations is the need to pay close attention to how their surveys are gathered and interpreted.
The problem? Raw data is always up for interpretation, so entrepreneurs conducting surveys must think critically about the methods, questions and biases behind their results.
To truly take the temperature of your particular entrepreneurial market niche, be sure to put controls in place that promote accurate collection and assessment of data. Here are some recommended strategies:
1. Shut out biases at every turn.
Like dye tossed into a washing machine, biases can quickly taint every data point a survey turns up. One of the greatest sources of bias is a study’s moderator: Everything from this person’s facial expressions to tone of voice to manner of dress can color the results.
To guard against this problem, conduct surveys anonymously, asking only for information central to the survey. Use mobile or in-app surveys to capture data conveniently. Customers are clamoring to interact with brands online, which is why it’s so surprising that just 17 percent of researchers leverage mobile surveys.
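To make the anonymity point concrete, here is a minimal sketch of what a response record might look like when it carries only the answers themselves. The field names are illustrative assumptions, not the schema of any particular survey tool.

```typescript
// A survey response that stores only what the analysis needs.
interface AnonymousResponse {
  surveyId: string;   // which questionnaire this belongs to
  answers: string[];  // one entry per question, nothing else
  submittedAt: Date;  // useful for analysis, not for identifying anyone
  // Deliberately absent: name, email, phone, device ID, moderator notes.
}

function recordResponse(surveyId: string, answers: string[]): AnonymousResponse {
  return { surveyId, answers, submittedAt: new Date() };
}

// Usage: capture an in-app response without any identifying details.
console.log(recordResponse("brand-sentiment-q2", ["Out", "Somewhat likely"]));
```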
2. Make ‘opt-in’ the standard.
Customers want to respond to surveys on their terms, so let them opt in on their own schedules. That’s where an in-app survey comes in handy: It doesn’t force itself upon users, so it prevents skewed results from people who just click through absentmindedly.
Try rewarding respondents with fewer ads, which most people would rather avoid entirely. A recent study found that 83 percent of people surveyed said they preferred surveys to comparatively intrusive mobile ads.
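As a rough sketch of that opt-in gate, the logic below only launches the questionnaire for users who explicitly asked for it. The names are hypothetical and not tied to any specific survey SDK.

```typescript
// The user's explicit invitation is the only thing that unlocks the survey.
interface SurveyInvite {
  accepted: boolean;  // user tapped "Sure, I'll answer a few questions"
  acceptedAt?: Date;  // when they opted in, so it runs on their schedule
}

function shouldShowSurvey(invite: SurveyInvite): boolean {
  // No opt-in, no survey: the prompt stays out of the user's way,
  // and absent-minded click-throughs never enter the sample.
  return invite.accepted;
}

// Usage: gate the questionnaire behind the user's own choice.
const invite: SurveyInvite = { accepted: true, acceptedAt: new Date() };
if (shouldShowSurvey(invite)) {
  console.log("Launching in-app survey on the user's schedule.");
}
```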
3. Screen users at the start.
When structured properly, a screening question can significantly boost a survey sample’s relevance. Filter respondents by their answers to questions about attitudes, behaviors, opinions or prior experiences.
For example, if you manufacture watches and want to know about consumers’ favorite timepieces, you might ask, “When’s the last time you wore a watch?” Use the results to screen out bare-wristed respondents irrelevant to your study.
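Here is a minimal sketch of how that watch screen might filter a panel before the main questionnaire ever appears. The answer scale and field names are illustrative assumptions.

```typescript
// Each respondent answers one screening question before anything else.
interface Respondent {
  id: string;
  lastWoreWatch: "this week" | "this month" | "this year" | "never";
}

// Only respondents with recent watch-wearing experience qualify for the study.
function passesScreen(r: Respondent): boolean {
  return r.lastWoreWatch === "this week" || r.lastWoreWatch === "this month";
}

const panel: Respondent[] = [
  { id: "a", lastWoreWatch: "this week" },
  { id: "b", lastWoreWatch: "never" },
];

// Bare-wristed respondents drop out before they see the main survey.
const qualified = panel.filter(passesScreen);
console.log(`${qualified.length} of ${panel.length} respondents qualified.`);
```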
4. Keep it brief.
Let’s face it: Nobody’s going to spend half an hour answering questions out of the goodness of his or her heart. So, considering that 52 percent of people won’t spend even three minutes on a survey, limit your questions to 10 or fewer.
Start by determining the survey’s objective: What do you want to glean from the findings? Then, work through the questions and answers, sticking to 140 characters or fewer for each. Eighty percent of people abandon surveys halfway through, so prioritize your questions to focus on what you really need to know, and cut the survey off after 10.
Yes, 10 is tough, but it’s doable with a focused survey. After Uber changed its logo, we wanted to determine whether that was a wise branding move. It took just five questions and 2,000 respondents to arrive at an answer: 44 percent of respondents didn’t know what the logo represented.
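For a back-of-the-envelope check on those limits, a draft survey can be flagged before it goes out the door. The 10-question and 140-character thresholds come from the advice above; everything else in this sketch is assumed.

```typescript
// Limits suggested above: at most 10 questions, each 140 characters or fewer.
const MAX_QUESTIONS = 10;
const MAX_QUESTION_LENGTH = 140;

function surveyIsBriefEnough(questions: string[]): boolean {
  if (questions.length > MAX_QUESTIONS) return false;
  return questions.every((q) => q.length <= MAX_QUESTION_LENGTH);
}

// Usage: flag an overlong draft before respondents ever see it.
const draft = [
  "When's the last time you wore a watch?",
  "What brand was it?",
];
console.log(surveyIsBriefEnough(draft) ? "Ship it." : "Trim it down first.");
```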
In the end, polls and surveys can’t tell the future, but they can tell you what consumers are thinking. So, take survey data for what it’s worth, but never take it at face value. As with most things in life, it’s always open to interpretation.