A reliability study for evaluating information extraction from radiology reports

George Hripcsak, Gilad J. Kuperman, Carol Friedman, Daniel F. Heitjan

Research output: Contribution to journal › Article › peer-review

35 Scopus citations

Abstract

Goal: To assess the reliability of a reference standard for an information extraction task.

Setting: Twenty-four physician raters from two sites and two specialties judged whether clinical conditions were present based on reading chest radiograph reports.

Methods: Variance components, generalizability (reliability) coefficients, and the number of expert raters needed to generate a reliable reference standard were estimated.

Results: Per-rater reliability averaged across conditions was 0.80 (95% CI, 0.79-0.81). Reliability for the nine individual conditions varied from 0.67 to 0.97, with central line presence and pneumothorax the most reliable, and pleural effusion (excluding CHF) and pneumonia the least reliable. One to two raters were needed to achieve a reliability of 0.70, and six raters, on average, were required to achieve a reliability of 0.95. This was far more reliable than a previously published per-rater reliability of 0.19 for a more complex task. Differences between sites were attributable to changes to the condition definitions.

Conclusion: In these evaluations, physician raters were able to judge very reliably the presence of clinical conditions based on text reports. Once the reliability of a specific rater is confirmed, it would be possible for that rater to create a reference standard reliable enough to assess aggregate measures on a system. Six raters would be needed to create a reference standard sufficient to assess a system on a case-by-case basis. These results should help evaluators design future information extraction studies for natural language processors and other knowledge-based systems.
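The relationship the abstract describes between per-rater reliability and the number of raters needed for a reference standard can be illustrated with the classic Spearman-Brown prophecy formula, which projects the reliability of an average over k raters from a single-rater reliability. This is a hedged sketch for intuition only: the paper estimates condition-specific variance components via generalizability theory, so its exact rater counts will not necessarily match a single-coefficient projection. The function names here are illustrative, not from the paper.

```python
import math

def spearman_brown(r_single: float, k: int) -> float:
    """Projected reliability of the mean of k raters' judgments,
    given a single-rater reliability r_single (Spearman-Brown)."""
    return k * r_single / (1 + (k - 1) * r_single)

def raters_needed(r_single: float, target: float) -> int:
    """Minimum number of raters whose averaged judgments reach the
    target reliability, by inverting the Spearman-Brown formula."""
    return math.ceil(target * (1 - r_single) / (r_single * (1 - target)))

# Illustrative check against the abstract's averaged single-rater
# reliability of 0.80 (the paper's per-condition answers differ,
# since they come from separate variance-component estimates):
print(raters_needed(0.80, 0.70))  # a single rater suffices for 0.70
print(raters_needed(0.80, 0.95))  # several raters are needed for 0.95
```

Note that for the least reliable conditions (per-rater reliability near 0.67), the projected composite reliability climbs toward 0.95 only with substantially more raters, which is consistent with the abstract's report that six raters were needed on average.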

Original language: English (US)
Pages (from-to): 143-150
Number of pages: 8
Journal: Journal of the American Medical Informatics Association
Volume: 6
Issue number: 2
DOIs
State: Published - 1999

ASJC Scopus subject areas

  • Health Informatics

