Survey design mistakes that can ruin respondent experience


It’s fair to say that the goal of creating a survey is to have respondents successfully complete it – and to get a completed survey, respondents must be able (and willing) to answer your questions. Survey sample buyers have much more control over survey success than they realize. In fact, poor survey design is the most likely culprit behind high survey drop rates. Surprised? You shouldn’t be. Poor respondent experience is a big deal, and it can be detrimental to your research in a number of ways.

In this post, we will explore specific survey elements that lead to high drop rates and how you can improve survey design to reduce drops. If your survey has a high drop rate (above 20%), this guide can help you identify what may be causing respondents to drop out.

Common survey mistakes to avoid

Whether you’re a first-time survey designer or your past surveys have been less successful than you’d hoped, you can create effective survey questions by avoiding common mistakes. Once you know what survey design issues tend to confuse, frustrate, or throw off respondents, you can reconfigure your surveys to generate more reliable and accurate responses.

Your questions should be free of all of the following survey errors.

Poor mobile optimization

Ignoring mobile respondents could cut your survey’s feasibility almost in half – which is significant, to say the least. With more than 44% of all traffic on our platform coming from mobile devices (smartphones or tablets), mobile optimization is key.

In markets like the U.S. and Australia, nearly 50% of respondents are using mobile devices.

That number is even higher in emerging markets, like India and Indonesia, where desktop computers are less pervasive.

However, we know that sometimes a survey absolutely cannot be optimized for mobile, and that’s alright. If that’s the case for you, just make sure your survey is only targeting desktop users, so mobile respondents won’t get stuck in your survey and inflate your drop rate.

Length of interview (LOI) is too long

It’s understandable to want to be thorough with the questions you’re asking in your survey, but it’s important to consider how it’s being received by respondents. Your survey is competing against games, videos, and all the other online options that respondents have when spending time on the internet. If your survey is too long, respondents may become exhausted and choose to spend their time elsewhere – especially if you are not offering a higher price for them to complete it.

Remember, a respondent’s time is valuable, so the effort it takes to complete the survey must be worth their time. It is also particularly frustrating for respondents who are “termed” during long surveys. (That means a respondent is terminated because their answers did not meet the criteria for qualified respondents.) So, if a respondent has spent 20 minutes taking a survey, only to get terminated, how motivated do you think they’ll be to take another one?

If you’re still not sure about the right LOI, consider that shorter surveys field faster and get better results. You can also take a look at our internal research which shows that the sweet spot for LOI is 10 minutes – short and sweet!

Too many open-end questions (OEs)

Another factor that affects respondent experience is excessive open-end questions (OEs), which lead to respondent fatigue, especially among mobile respondents. Instead of limiting survey-takers to a list of answer options, open-end questions prompt them to expand upon their answer by typing an extended response. These questions can be time-consuming and often require more effort from respondents than they’re willing to give.

This has a big impact on response quality because, in some cases, respondents will type gibberish in order to get through the survey. In other cases, drop rates will increase because respondents may simply exit the survey if there are too many OEs.

So, if you can, try to stick to fewer than five OEs!

Excessive or lengthy grid questions

Repetitive grids also cause respondent fatigue, which often leads to respondents straightlining, speeding through, or abandoning the survey entirely.
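If you collect raw grid responses, straightlining is easy to spot after the fact. Here is a minimal Python sketch of such a check – the function name and the data shape (one list of answers per respondent, per grid question) are illustrative assumptions, not a specific platform’s API:

```python
def flag_straightliners(grid_responses):
    """Flag respondents who gave the identical answer to every row of a grid.

    grid_responses: dict mapping respondent_id -> list of answers for one
    grid question. Returns the IDs of likely straightliners.
    """
    return [
        respondent_id
        for respondent_id, answers in grid_responses.items()
        if len(answers) > 1 and len(set(answers)) == 1
    ]

responses = {
    "r1": [3, 3, 3, 3, 3],   # same rating on every row: suspicious
    "r2": [4, 2, 5, 3, 4],   # varied ratings: fine
    "r3": [1, 1, 1, 1, 1],   # suspicious
}
print(flag_straightliners(responses))  # ['r1', 'r3']
```

A spike in flagged respondents on a particular grid is a strong hint that the grid itself is too long or repetitive.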

This is especially important on mobile devices, where grids require manual scrolling and can cause display issues. Lengthy grid questions force respondents to work too hard to complete the survey.

Repetitive questions

Respondents are impatient when it comes to questions being asked in different ways over and over. For example, a survey that contains the questions, “Do you like hanging out in large groups?”, “Do you enjoy group gatherings?”, and “Do you often socialize in large friend groups?” may annoy the respondent.

Although some researchers intentionally include repeat questions as a quality check to measure the consistency of respondents’ answers, we recommend employing that approach very thoughtfully so respondents are not left with a poor impression.

Media loading issues (videos, images, and audio)

Large media files take longer to download, which can actually cost mobile respondents more in data charges from their service providers than the survey incentive pays them.

Another problem with long loading times is that the respondent may think the link is broken – so they’re likely to leave the survey before the media loads.

Trouble with translation quality

Getting survey translations right is very important because respondents who cannot understand the survey will drop from the survey. Or, they’ll complete the survey without understanding it, resulting in lower quality responses.

It’s important to note that Google Translate is not a sufficient survey translation tool – professional translation is always required. Questions run through Google Translate can easily get lost in translation, depending on sentence structure and concept complexity, producing confusing sentences that don’t make sense in the target language. Google Translate is also known to produce sentences riddled with grammatical errors. Survey questions like these can make your questionnaire seem sloppy and unprofessional.

Loaded questions

Loaded questions are some of the most common survey design mistakes that we recommend adamantly avoiding when crafting your questionnaire. Loaded questions assume something about a respondent that may not be true, forcing them to submit an answer that inaccurately reflects their opinions or feelings.

For example, a loaded question might ask a survey-taker, “Where do you like to go hiking?” However the respondent answers, their response implies that they like to hike, which may not be true. If the individual doesn’t enjoy hiking, they have no way to answer the question truthfully.

You can resolve this issue by asking a preliminary question that will give insight into the respondent’s feelings toward hiking — for example, “Do you like hiking?” — then use, “Where do you like to go hiking?” as a follow-up question.
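That preliminary-then-follow-up pattern is just simple skip logic. As a sketch only – the question IDs and function name here are hypothetical, not any survey platform’s API – the routing rule might look like this in Python:

```python
def next_question(answers):
    """Route respondents based on a screening question (illustrative skip logic).

    answers: dict of question_id -> response collected so far.
    Returns the ID of the next question to show.
    """
    if answers.get("likes_hiking") == "No":
        return "end_of_section"       # skip the follow-up entirely
    return "where_do_you_hike"        # only shown to people who like hiking

print(next_question({"likes_hiking": "Yes"}))  # where_do_you_hike
print(next_question({"likes_hiking": "No"}))   # end_of_section
```

Respondents who answer “No” never see the loaded follow-up, so every answer you collect reflects a genuine opinion.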

Complicated language

Your surveys should pose each question clearly, concisely, and in a way that any respondent could understand. That means using simple, uncomplicated language that avoids technical jargon or acronyms that may confuse survey-takers. If your questionnaire includes terms or concepts that may not be common knowledge to your survey sample, you should provide definitions and examples to supplement your respondents’ understanding.

“Do you own a gaming console, such as a PlayStation, Xbox, or Nintendo Switch?” would be a question that speaks the reader’s language. It ensures that even if the respondent isn’t familiar with the term “gaming console,” the listed examples will give them a clear idea of what it means.

Leading questions

Every survey question should be completely objective, using neutral language that avoids influencing the respondent’s answer. Leading questions feature emotionally charged words, images, and stereotypes that play on implicit biases and push the survey-taker toward a particular response. These errors in survey questions are usually fairly obvious to respondents and often result in high drop-out rates.

For example, the question, “How would you rate the flavor of this brand’s creamy and delicious chocolate bars?” uses biased language that attempts to coerce respondents into answering positively. A more neutral question would be, “How would you rate the flavor of this brand’s chocolate bars?”

Vague questions

One of the most crucial mistakes to avoid during questionnaire design is creating vague questions. Indirect and ambiguous survey questions can leave respondents feeling lost, confused, or frustrated. To avoid this common error, make sure your language is straightforward and easy to interpret so that you get the most accurate responses from your survey-takers.

The question, “Do you drink milk regularly?” is a vague question that may result in multiple reader interpretations. The word “regularly” can mean different things to different people — one person may think regular consumption means drinking milk daily, while another may consider it to mean once or twice a week. A better way to phrase the question would be, “How often do you drink milk?” This rephrasing is clear, direct, and elicits a concrete answer.

When in doubt, test it out!

Testing, from beginning to end, is essential before launching any survey. This ensures the mechanics of the survey (redirects, links, buttons, etc.) are working correctly.

Testing is also the best way to diagnose a high drop rate. A natural checkpoint to look at your drop rate is after a “soft launch” of 10% of the total needed completes. If the drop rate is high (above 20%), test to see what survey elements might be leading to a poor respondent experience. If it’s hard to finish your own survey, respondents will have an even harder time completing it.
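The soft-launch checkpoint above is simple arithmetic. The 20% threshold and 10% soft-launch share come from this post; the function names in this Python sketch are purely illustrative:

```python
def drop_rate(starts, completes):
    """Share of respondents who entered the survey but did not finish it."""
    if starts == 0:
        raise ValueError("no respondents have started the survey yet")
    return (starts - completes) / starts

def soft_launch_target(total_completes_needed, share=0.10):
    """Completes to collect before pausing to review drop rate (10% soft launch)."""
    return round(total_completes_needed * share)

# Example: 1,000 total completes needed -> pause after the first 100 completes.
needed = soft_launch_target(1000)            # 100
rate = drop_rate(starts=140, completes=100)  # about 0.29
if rate > 0.20:
    print(f"Drop rate {rate:.0%} exceeds the 20% threshold: review survey design")
```

If the check fires at soft launch, pause fielding and work through the design issues above before spending the rest of your sample.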

As we mentioned earlier, nearly half of our survey respondents are on mobile devices. This is important to note because, while survey design mistakes are challenging for all respondents, they can be deal-breakers for mobile users. So, a poorly designed survey can have a significant effect on the outcome of your study.

To learn more about survey design best practices, check out our blog posts on questionnaire design and online survey supply and demand. You can also contact our team if you have questions about improving your survey design!