Reducing “I Don’t Know” Responses and Missing Survey Data

Implications for Measurement

Deanna C. Denman, Austin S. Baldwin, Andrea C. Betts, Amy McQueen, Jasmin A. Tiro

Research output: Contribution to journal › Article

2 Citations (Scopus)

Abstract

Background. “I don’t know” (DK) responses are common in health behavior research. Yet analytic approaches to managing DK responses may undermine survey validity and researchers’ ability to interpret findings. Objective. Compare the usefulness of a methodological strategy for reducing DK responses to 3 analytic approaches: 1) excluding DKs as missing data, 2) recoding them to the neutral point of the response scale, and 3) recoding DKs with the mean. Methods. We used a 4-group design to compare a methodological strategy, which encourages use of the response scale after an initial DK response, to 3 methods of analytically treating DK responses. We examined 1) whether this methodological strategy reduced the frequency of DK responses, and 2) how the methodological strategy compared to common analytic treatments in terms of factor structure and strength of correlations between measures of constructs. Results. The prompt reduced DK response frequency (55.7% of 164 unprompted participants vs. 19.6% of 102 prompted participants). Factorial invariance analyses suggested equivalence in factor loadings for all constructs across the groups. Compared to excluding DKs, recoding strategies and use of the prompt improved the strength of correlations between constructs, with the prompt resulting in the strongest correlations (.589 for benefits and intentions, .446 for perceived susceptibility and intentions, and .329 for benefits and perceived susceptibility). Limitations. This study was not designed a priori to test methods for addressing DK responses. Our analysis was limited to an interviewer-administered survey, and interviewers did not probe about reasons for DK responses. Conclusion. Findings suggest that use of a prompt to reduce DK responses is preferable to analytic approaches to treating DK responses. Use of such prompts may improve the validity of health behavior survey research.
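The three analytic treatments compared in the abstract can be sketched in a few lines of pandas. This is an illustrative sketch only, not the authors' code: the toy response vector and the 1–5 scale (with 3 as its neutral point) are assumptions for the example.

```python
import pandas as pd

# Hypothetical item on a 1-5 scale; "DK" marks an "I don't know" answer.
responses = pd.Series([4, 2, "DK", 5, 3, "DK", 1, 4])

# Coerce DK answers to NaN so they behave as missing values.
numeric = pd.to_numeric(responses, errors="coerce")

# 1) Exclude DKs as missing data (drops those participants' answers).
excluded = numeric.dropna()

# 2) Recode DKs to the neutral point of the response scale (3 on 1-5).
neutral = numeric.fillna(3)

# 3) Recode DKs with the mean of the observed responses.
mean_imputed = numeric.fillna(numeric.mean())
```

Each treatment yields a different analytic sample: exclusion shrinks it, while the two recoding strategies retain all participants but make different assumptions about what a DK answer means.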

Original language: English (US)
Journal: Medical Decision Making
DOI: 10.1177/0272989X18785159
State: Accepted/In press - Jun 1 2018

Keywords

  • DK responses
  • health behavior
  • methodological strategy
  • survey data
  • validity

ASJC Scopus subject areas

  • Health Policy

Cite this

Reducing “I Don’t Know” Responses and Missing Survey Data: Implications for Measurement. / Denman, Deanna C.; Baldwin, Austin S.; Betts, Andrea C.; McQueen, Amy; Tiro, Jasmin A.

In: Medical Decision Making, 01.06.2018.

Research output: Contribution to journal › Article

@article{266c93d0f1cd43e591fcd429c698e96b,
title = "Reducing “I Don’t Know” Responses and Missing Survey Data: Implications for Measurement",
abstract = "Background. “I don’t know” (DK) responses are common in health behavior research. Yet analytic approaches to managing DK responses may undermine survey validity and researchers’ ability to interpret findings. Objective. Compare the usefulness of a methodological strategy for reducing DK responses to 3 analytic approaches: 1) excluding DKs as missing data, 2) recoding them to the neutral point of the response scale, and 3) recoding DKs with the mean. Methods. We used a 4-group design to compare a methodological strategy, which encourages use of the response scale after an initial DK response, to 3 methods of analytically treating DK responses. We examined 1) whether this methodological strategy reduced the frequency of DK responses, and 2) how the methodological strategy compared to common analytic treatments in terms of factor structure and strength of correlations between measures of constructs. Results. The prompt reduced DK response frequency (55.7{\%} of 164 unprompted participants vs. 19.6{\%} of 102 prompted participants). Factorial invariance analyses suggested equivalence in factor loadings for all constructs across the groups. Compared to excluding DKs, recoding strategies and use of the prompt improved the strength of correlations between constructs, with the prompt resulting in the strongest correlations (.589 for benefits and intentions, .446 for perceived susceptibility and intentions, and .329 for benefits and perceived susceptibility). Limitations. This study was not designed a priori to test methods for addressing DK responses. Our analysis was limited to an interviewer-administered survey, and interviewers did not probe about reasons for DK responses. Conclusion. Findings suggest that use of a prompt to reduce DK responses is preferable to analytic approaches to treating DK responses. Use of such prompts may improve the validity of health behavior survey research.",
keywords = "DK responses, health behavior, methodological strategy, survey data, validity",
author = "Denman, {Deanna C.} and Baldwin, {Austin S.} and Betts, {Andrea C.} and Amy McQueen and Tiro, {Jasmin A.}",
year = "2018",
month = "6",
day = "1",
doi = "10.1177/0272989X18785159",
language = "English (US)",
journal = "Medical decision making : an international journal of the Society for Medical Decision Making",
issn = "0272-989X",
publisher = "SAGE Publications Inc.",

}

TY - JOUR

T1 - Reducing “I Don’t Know” Responses and Missing Survey Data

T2 - Implications for Measurement

AU - Denman, Deanna C.

AU - Baldwin, Austin S.

AU - Betts, Andrea C.

AU - McQueen, Amy

AU - Tiro, Jasmin A.

PY - 2018/6/1

Y1 - 2018/6/1

N2 - Background. “I don’t know” (DK) responses are common in health behavior research. Yet analytic approaches to managing DK responses may undermine survey validity and researchers’ ability to interpret findings. Objective. Compare the usefulness of a methodological strategy for reducing DK responses to 3 analytic approaches: 1) excluding DKs as missing data, 2) recoding them to the neutral point of the response scale, and 3) recoding DKs with the mean. Methods. We used a 4-group design to compare a methodological strategy, which encourages use of the response scale after an initial DK response, to 3 methods of analytically treating DK responses. We examined 1) whether this methodological strategy reduced the frequency of DK responses, and 2) how the methodological strategy compared to common analytic treatments in terms of factor structure and strength of correlations between measures of constructs. Results. The prompt reduced DK response frequency (55.7% of 164 unprompted participants vs. 19.6% of 102 prompted participants). Factorial invariance analyses suggested equivalence in factor loadings for all constructs across the groups. Compared to excluding DKs, recoding strategies and use of the prompt improved the strength of correlations between constructs, with the prompt resulting in the strongest correlations (.589 for benefits and intentions, .446 for perceived susceptibility and intentions, and .329 for benefits and perceived susceptibility). Limitations. This study was not designed a priori to test methods for addressing DK responses. Our analysis was limited to an interviewer-administered survey, and interviewers did not probe about reasons for DK responses. Conclusion. Findings suggest that use of a prompt to reduce DK responses is preferable to analytic approaches to treating DK responses. Use of such prompts may improve the validity of health behavior survey research.

AB - Background. “I don’t know” (DK) responses are common in health behavior research. Yet analytic approaches to managing DK responses may undermine survey validity and researchers’ ability to interpret findings. Objective. Compare the usefulness of a methodological strategy for reducing DK responses to 3 analytic approaches: 1) excluding DKs as missing data, 2) recoding them to the neutral point of the response scale, and 3) recoding DKs with the mean. Methods. We used a 4-group design to compare a methodological strategy, which encourages use of the response scale after an initial DK response, to 3 methods of analytically treating DK responses. We examined 1) whether this methodological strategy reduced the frequency of DK responses, and 2) how the methodological strategy compared to common analytic treatments in terms of factor structure and strength of correlations between measures of constructs. Results. The prompt reduced DK response frequency (55.7% of 164 unprompted participants vs. 19.6% of 102 prompted participants). Factorial invariance analyses suggested equivalence in factor loadings for all constructs across the groups. Compared to excluding DKs, recoding strategies and use of the prompt improved the strength of correlations between constructs, with the prompt resulting in the strongest correlations (.589 for benefits and intentions, .446 for perceived susceptibility and intentions, and .329 for benefits and perceived susceptibility). Limitations. This study was not designed a priori to test methods for addressing DK responses. Our analysis was limited to an interviewer-administered survey, and interviewers did not probe about reasons for DK responses. Conclusion. Findings suggest that use of a prompt to reduce DK responses is preferable to analytic approaches to treating DK responses. Use of such prompts may improve the validity of health behavior survey research.

KW - DK responses

KW - health behavior

KW - methodological strategy

KW - survey data

KW - validity

UR - http://www.scopus.com/inward/record.url?scp=85049825738&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85049825738&partnerID=8YFLogxK

U2 - 10.1177/0272989X18785159

DO - 10.1177/0272989X18785159

M3 - Article

JO - Medical decision making : an international journal of the Society for Medical Decision Making

JF - Medical decision making : an international journal of the Society for Medical Decision Making

SN - 0272-989X

ER -