Clinician collaboration to improve clinical decision support: The Clickbusters initiative

Allison B. McCoy, Elise M. Russo, Kevin B. Johnson, Bobby Addison, Neal Patel, Jonathan P. Wanderer, Dara E. Mize, Jon G. Jackson, Thomas J. Reese, Sylinda Littlejohn, Lorraine Patterson, Tina French, Debbie Preston, Audra Rosenbury, Charlie Valdez, Scott D. Nelson, Chetan V. Aher, Mhd Wael Alrifai, Jennifer Andrews, Cheryl Cobb, Sara N. Horst, David P. Johnson, Lindsey A. Knake, Adam A. Lewis, Laura Parks, Sharidan K. Parr, Pratik Patel, Barron L. Patterson, Christine M. Smith, Krystle D. Suszter, Robert W. Turer, Lyndy J. Wilcox, Aileen P. Wright, Adam Wright

Research output: Contribution to journal › Article › peer-review

12 Scopus citations

Abstract

Objective: We describe the Clickbusters initiative implemented at Vanderbilt University Medical Center (VUMC), which was designed to improve safety and quality and reduce burnout through the optimization of clinical decision support (CDS) alerts.

Materials and Methods: We developed a 10-step Clickbusting process and implemented a program that included a curriculum, CDS alert inventory, oversight process, and gamification. We carried out two 3-month rounds of the Clickbusters program at VUMC. We completed descriptive analyses of the changes made to alerts during the process and of alert firing rates before and after the program.

Results: Prior to Clickbusters, VUMC had 419 CDS alerts in production, with 488 425 firings (42 982 interruptive) each week. After 2 rounds, the Clickbusters program resulted in detailed, comprehensive reviews of 84 CDS alerts and reduced the number of weekly alert firings by more than 70 000 (15.43%). In addition to the direct improvements in CDS, the initiative also increased user engagement and involvement in CDS.

Conclusions: At VUMC, the Clickbusters program was successful in optimizing CDS alerts by reducing alert firings and resulting clicks. The program also involved more users in the process of evaluating and improving CDS and helped build a culture of continuous evaluation and improvement of clinical content in the electronic health record.
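To relate the figures reported above, the short Python sketch below derives the implied absolute drop in weekly alert firings from the stated baseline (488 425 firings per week) and the reported 15.43% reduction; the post-intervention count is not given in the abstract, so the values printed here are a back-of-the-envelope reconstruction, not numbers reported by the authors.

```python
# Illustrative sanity check using only the figures stated in the abstract.
baseline_weekly_firings = 488_425   # weekly CDS alert firings before Clickbusters
reported_reduction_pct = 15.43      # percent reduction after two program rounds

# Absolute drop implied by the baseline and the reported percentage.
implied_drop = baseline_weekly_firings * reported_reduction_pct / 100
implied_remaining = baseline_weekly_firings - implied_drop

print(f"Implied weekly firings eliminated: {implied_drop:,.0f}")    # ~75,364 (> 70,000, as reported)
print(f"Implied weekly firings remaining:  {implied_remaining:,.0f}")  # ~413,061 (assumption, not reported)
```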

Original language: English (US)
Pages (from-to): 1050-1059
Number of pages: 10
Journal: Journal of the American Medical Informatics Association
Volume: 29
Issue number: 6
DOIs
State: Published - Jun 1 2022
Externally published: Yes

Keywords

  • clinical decision support systems
  • electronic health records
  • evaluation study
  • quality improvement
  • user engagement

ASJC Scopus subject areas

  • Health Informatics

