• Presentation

Minimizing Respondent Effort Increases Use of Definitions in Web Surveys

Citation

Peytchev, A. A., Conrad, F. G., Couper, M. P., & Tourangeau, R. (2007, May). Minimizing Respondent Effort Increases Use of Definitions in Web Surveys. Presented at American Association for Public Opinion Research Conference, Anaheim, CA.

Abstract

Accurate survey data require that respondents interpret survey questions as intended by researchers. In interviewer-administered surveys, allowing the interviewer to correct misunderstandings can improve response accuracy (e.g., Schober, Conrad, & Fricker, 2004). In contrast, it is hard to clarify misconceptions in self-administered questionnaires. In the current study, we examine clarification in web questionnaires, a self-administered mode in which it is possible to request clarification (definitions) much as in interviews. The main question is whether respondents will use the clarification that is available. Previous experiments have shown that making definitions easier to obtain increases their use (Conrad, Couper, Tourangeau, & Peytchev, 2006). Yet providing definitions only to those who request them can be seen as a deviation from the standardized interview. In the current study, we examine whether providing definitions to all respondents can be more effective than even the easiest request method identified, without detrimental effects from presenting more information to all respondents. We also manipulate respondents' awareness that definitions are useful and the length of the definitions. We hypothesized that respondents would use definitions more when they were always displayed than when they had to be requested, because eye movements involve less effort than mouse movements. We tested this and other hypotheses by examining answers, response times, and break-offs in a web survey with about 3,000 respondents. Definitions that were always displayed were used more, based on the time to complete the questions and the effect on responses, suggesting that an interactive questionnaire may deter use of definitions because it requires extra respondent actions. Contrary to our expectations, prior training questions decreased the use of definitions, suggesting that respondents may value minimal effort (not reading definitions) over improved understanding (reading definitions). We close by discussing implications for the design of web surveys that promote accurate question understanding.