Reducing “I Don’t Know” Responses and Missing Survey Data: Implications for Measurement

Deanna C. Denman, Austin S. Baldwin, Andrea C. Betts, Amy McQueen, Jasmin A. Tiro

Research output: Contribution to journal › Article › peer-review

33 Scopus citations

Abstract

Background. “I don’t know” (DK) responses are common in health behavior research, yet analytic approaches to managing DK responses may undermine survey validity and researchers’ ability to interpret findings. Objective. Compare the usefulness of a methodological strategy for reducing DK responses with 3 analytic approaches: 1) excluding DKs as missing data, 2) recoding DKs to the neutral point of the response scale, and 3) recoding DKs to the item mean. Methods. We used a 4-group design to compare a methodological strategy, which encourages use of the response scale after an initial DK response, with 3 methods of analytically treating DK responses. We examined 1) whether this methodological strategy reduced the frequency of DK responses, and 2) how the methodological strategy compared with common analytic treatments in terms of factor structure and strength of correlations between measures of constructs. Results. The prompt reduced DK response frequency (55.7% of 164 unprompted participants vs. 19.6% of 102 prompted participants). Factorial invariance analyses suggested equivalence in factor loadings for all constructs across the groups. Compared with excluding DKs, the recoding strategies and use of the prompt improved the strength of correlations between constructs, with the prompt yielding the strongest correlations (.589 for benefits and intentions, .446 for perceived susceptibility and intentions, and .329 for benefits and perceived susceptibility). Limitations. This study was not designed a priori to test methods for addressing DK responses. Our analysis was limited to an interviewer-administered survey, and interviewers did not probe reasons for DK responses. Conclusion. Findings suggest that using a prompt to reduce DK responses is preferable to analytic approaches to treating DK responses. Use of such prompts may improve the validity of health behavior survey research.
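As an illustration only (not code from the article), the three analytic treatments of DK responses described in the abstract could be sketched with pandas as follows; the 5-point response scale and the toy data are assumptions made for the example.

```python
# Hypothetical sketch of the three analytic treatments of "I don't know" (DK)
# responses: exclusion, neutral-point recoding, and mean recoding.
# Assumes a 5-point scale (1-5) with 3 as the neutral point.
import pandas as pd

# Toy item responses; "DK" marks an "I don't know" answer
responses = pd.Series([1, 3, "DK", 5, 4, "DK", 2])
numeric = pd.to_numeric(responses, errors="coerce")  # DK -> NaN

# 1) Exclude DKs as missing data
excluded = numeric.dropna()

# 2) Recode DKs to the neutral point of the response scale
neutral_recode = numeric.fillna(3)

# 3) Recode DKs to the mean of the valid responses
mean_recode = numeric.fillna(numeric.mean())
```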

Original language: English (US)
Pages (from-to): 673-682
Number of pages: 10
Journal: Medical Decision Making
Volume: 38
Issue number: 6
DOIs
State: Published - Aug 1 2018

Keywords

  • DK responses
  • health behavior
  • methodological strategy
  • survey data
  • validity

ASJC Scopus subject areas

  • Health Policy
