
Survey Length

Jun 19, 2023


Why should someone care about survey length?

The greatest challenges in medical technology marketing research include a limited respondent universe and a high cost per survey, since substantial incentives are paid to participating physicians, nursing staff, hospital/department administrators, and executives. This is why marketing research agencies should always:


  1. Maximize the sample quality versus the cost;
  2. Ensure the survey is focused on relevant, qualified targets through a rigorous screener;
  3. Optimize survey design to address the study objectives while making the survey experience enjoyable and professional.


Some agencies field online, self-administered surveys with a length of survey (LOS) that exceeds 30 minutes and sometimes extends to 60 minutes. This is not advisable, because healthcare providers and hospital administrators:

  • Are busy professionals who are likely to “rush through” to the end of a survey or simply abandon it, and
  • Will likely experience survey fatigue.


Regardless of the cause, this creates data quality challenges. To avoid these issues, The MarkeTech Group (TMTG) has made it a standard practice to rarely exceed a 20-minute LOS. In our business, it is unusual for a study’s objectives to require a longer survey. In this paper, a literature review on survey duration and completion rates is supplemented with TMTG research results that demonstrate the need to avoid long online surveys.


How to measure data quality in an online survey?


Measuring survey data quality is complex, and researchers typically use metrics linked to survey participation, such as the response rate, incidence rate, and completion rate; a minimal worked example computing these rates follows the definitions below.


  • The response rate (RR), defined as the number of people who started the survey divided by the number of people invited to participate, mainly indicates how relevant and interesting the survey is to the targeted sample.
  • The incidence rate (IR), defined as the number of people who qualified for the survey after passing the screener divided by the number of people who started the survey, is important to assess not only to ensure that only relevant respondents take the survey but also to allow market projections. A stringent screener (low IR) can lead to feasibility and cost issues as well as market projection limitations. On the other hand, a high IR can be a problem if unqualified respondents take the survey, thereby decreasing trust in the data.
  • The completion rate (CR), defined as the number of people who completed the survey divided by the number of people who started it, is an insightful design quality measurement, because it directly relates to respondents’ experience of and interactions with the survey. A low completion rate means respondents drop out of the survey at some point for various reasons. In such cases, a root cause analysis can be conducted through post-survey interviews with these respondents. The interviews often reveal a frustrating survey experience because the survey is too long or contains cumbersome questions that can be seen as misleading.
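To make these definitions concrete, here is a minimal sketch that computes the three rates from raw participation counts. The function name and the counts are purely illustrative assumptions, not TMTG data.

```python
def participation_metrics(invited: int, started: int, qualified: int, completed: int) -> dict:
    """Return RR, IR, and CR as fractions of their respective denominators."""
    return {
        "response_rate": started / invited,      # started / invited to participate
        "incidence_rate": qualified / started,   # passed the screener / started
        "completion_rate": completed / started,  # finished the survey / started
    }

# Purely illustrative counts (not TMTG data):
print(participation_metrics(invited=1000, started=250, qualified=180, completed=150))
# -> {'response_rate': 0.25, 'incidence_rate': 0.72, 'completion_rate': 0.6}
```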


Literature review shows long LOS can create challenges


Current studies examining online survey participation provide insight into the impact of survey length on participation and completion rates. Below is a summary of the survey length literature that helps optimize survey design:


  1. Respondents expect a LOS of less than 20 minutes
  • When participants are asked directly, the ideal survey length is a median of 10 minutes and a maximum of 20 minutes.1
  2. There is a negative relationship between survey length and both completion and response rates
  • An examination of 25,080 real-world web surveys conducted by a single online panel, using regression analysis, demonstrates that the completion rate decreases as the number of questions, pages, or words increases.2
  • When testing characteristics recognized to affect response rates, including LOS, with 2,000+ respondents of a web-based, self-administered survey, the shorter survey of 91 items (announced to take between 10 and 20 minutes) yielded a response rate 12 percentage points higher than the longer survey of 359 items (announced to take between 30 and 60 minutes).3
  • When comparing two different versions of a survey, the short version (15–30 min) had a higher response rate than the long version (30–45 min).4
  • When 10,000 people were invited to participate in an online survey without the length being mentioned before the start, a higher dropout rate was found in the longer survey of 42 questions (29%) than in the shorter survey of 20 questions (23%).5
  3. Longer stated LOS correlates with lower response and completion rates
  • When testing three different announced LOS (10, 20, or 30 min) for the same survey, the longer the stated length, the fewer respondents started and completed the questionnaire. In addition, answers to open-ended questions positioned later in the questionnaire were faster, shorter, and more uniform than answers to open-ended questions positioned near the beginning.6
  • In another study, 20,000 participants were invited to complete an identical online survey; when the stated LOS was 5 minutes rather than 15 minutes, respondents were more likely to begin the survey (higher start rate). The completion rate was also higher for the stated 5-minute survey (9.3%) than for the stated 15-minute survey (6.9%), despite the surveys being identical.7
  • When comparing two stated survey lengths, 3–5 minutes and 10–15 minutes, the survey with the shorter stated length had a higher completion rate, even though the two surveys were in fact identical.8


The MarkeTech Group’s (TMTG) internal studies corroborate the literature


TMTG recently published a peer-reviewed scientific paper focused on conjoint and survey design efficiency, aimed at understanding the preferences of imaging professionals when taking online surveys. Although offering an adequate incentive and a user-friendly platform were important, LOS had a significant impact on respondent preference. Specifically, when comparing surveys with LOS ranging from 10 to 25 minutes, the 10-minute survey captured a 47% preference share, while the 20- and 25-minute surveys captured 16% and 6%, respectively. This clearly indicates that preference decreases as survey duration increases.
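For readers unfamiliar with conjoint output, the sketch below illustrates how preference shares of this kind are typically derived, using a simple logit (share-of-preference) rule applied to part-worth utilities for the LOS attribute. The utility values and the calculation are illustrative assumptions, not the model or estimates from the TMTG paper.

```python
import math

# Hypothetical part-worth utilities for the length-of-survey attribute
# (illustrative values only, not the estimates from the TMTG paper).
utilities = {"10 min": 1.8, "15 min": 0.9, "20 min": 0.3, "25 min": -0.7}

# Logit share-of-preference rule: exponentiate each utility and normalize
# so the shares across the LOS levels sum to 100%.
exp_utilities = {level: math.exp(u) for level, u in utilities.items()}
total = sum(exp_utilities.values())
shares = {level: e / total for level, e in exp_utilities.items()}

for level, share in shares.items():
    print(f"{level}: {share:.0%}")
```

Higher utilities translate into disproportionately larger shares, which is why preference concentrates on the shortest survey in this kind of analysis.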


Conclusion


When conducting quantitative healthcare marketing research through an online survey platform, all aspects of survey design, in particular LOS, must be taken into account. A short LOS limits the risk of collecting incomplete survey responses, which is especially critical when the respondent pool is limited, as it is with healthcare providers. TMTG strongly supports a LOS not exceeding 20 minutes to optimize participation and ensure complete data sets, so that business decision making relies on complete information. TMTG is an experienced provider of quantitative marketing research methodologies to address business questions around new product development and/or pricing. Our efficient design process always takes the best approach to optimize data accuracy and the project budget.


Further details on quantitative methodologies

Scales and Measurement: how to ensure quality results with consistent measurement?
Gauging Customer Preference: how to create realistic consumer choice situations to understand important features of an offer?
Improving Online Survey Efficiency: using conjoint analysis to understand preference for online survey layout


References

  1. Ideal and Maximum Length for a Web Survey; Revilla et al.; International Journal of Market Research, 2017, volume 59, issue 5, pages 557-565
  2. Examining Completion Rates in Web Surveys via Over 25,000 Real-World Surveys; Liu et al.; Social Science Computer Review, 2017, volume 36, issue 1, pages 116-124
  3. Compensating for Low Topic Interest and Long Surveys: A Field Experiment on Nonresponse in Web Surveys; Marcus et al.; Social Science Computer Review, 2007, volume 25, issue 3, pages 372-383
  4. Response Rate and Response Quality of Internet-Based Surveys: An Experimental Study; Deutskens et al.; Marketing Letters, 2004, volume 15, issue 1, pages 21-36
  5. The Influence of the Design of Web Survey Questionnaires on the Quality of Responses; Ganassali; Survey Research Methods, 2008, volume 2, issue 1, pages 21-32
  6. Effects of Questionnaire Length on Participation and Indicators of Response Quality in a Web Survey; Galesic et al.; Public Opinion Quarterly, 2009, volume 73, issue 2, pages 349-360
  7. The Influence of Web-based Questionnaire Presentation Variations on Survey Cooperation and Perceptions of Survey Quality; Walston et al.; Journal of Official Statistics, 2006, volume 22, issue 2, pages 271-291
  8. How You Ask Counts: A Test of Internet-Related Components of Response Rates to a Web-Based Survey; Trouteaud; Social Science Computer Review, 2004, volume 22, issue 3, pages 385-392