Crowdsourced Assessment of Inanimate Biotissue Drills: A Valid and Cost-Effective Way to Evaluate Surgical Trainees

Mary Joe K. Rice, Mazen S. Zenati, Stephanie M. Novak, Amr I. Al Abbas, Amer H. Zureikat, Herbert J. Zeh, Melissa E. Hogg

Research output: Contribution to journal › Article › peer-review


Abstract

OBJECTIVE: Providing feedback to surgical trainees is a critical component of technical skills assessment, yet it remains costly and time consuming. We hypothesize that statistical selection can identify a homogeneous group of nonexpert crowdworkers capable of accurately grading inanimate surgical video.

DESIGN: Applicants auditioned by grading 9 training videos using the Objective Structured Assessment of Technical Skills (OSATS) tool and an error-based checklist. The summed OSATS, summed errors, and OSATS summary score were tested for outliers using Cronbach's alpha and single-measure intraclass correlation. Accepted crowdworkers then submitted grades for videos in 3 different compositions: full video at 1× speed, full video at 2× speed, and critical-section segmented video. Graders were blinded to the study, and a similar statistical analysis was performed.

SETTING: The study was conducted at the University of Pittsburgh Medical Center (Pittsburgh, PA), a tertiary care academic teaching hospital.

PARTICIPANTS: Thirty-six premedical students participated as crowdworker applicants, and 2 surgery experts served as the gold standard.

RESULTS: For the first hire group, the intraclass correlation was 0.717 for Total Errors and 0.794 for Total OSATS; for the second hire group, it was 0.800 for Total OSATS and 0.654 for Total Errors. There was very good correlation between full videos at 1× and 2× speed, with an inter-item statistic of 0.817 for errors and 0.86 for OSATS. Only moderate correlation was found with critical-section segments. In 1 year, 275 hours of inanimate video were graded at a cost of $22.27/video, or $1.03/minute.

CONCLUSIONS: Statistical selection can be used to identify a homogeneous cohort of crowdworkers to grade trainees' inanimate drills. Crowdworkers can distinguish OSATS metrics and errors in full videos at 2× speed but were less consistent with segmented videos. The program is a comparatively cost-effective way to provide feedback to surgical trainees.
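To make the screening step concrete, below is a minimal sketch (numpy only) of the kind of rater-consistency analysis the abstract describes: Cronbach's alpha and a single-measure intraclass correlation computed over a videos × raters score matrix, with a leave-one-out rule for flagging inconsistent applicants. The matrix layout, the simulated data, and the flagging rule are illustrative assumptions, not the paper's actual protocol or thresholds.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha, treating each rater as an 'item' across videos."""
    k = scores.shape[1]                         # number of raters
    item_var = scores.var(axis=0, ddof=1)       # per-rater variance across videos
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of summed scores
    return k / (k - 1) * (1.0 - item_var.sum() / total_var)

def icc_single(scores: np.ndarray) -> float:
    """ICC(1,1): one-way random effects, single measure."""
    n, k = scores.shape
    row_means = scores.mean(axis=1)
    msb = k * ((row_means - scores.mean()) ** 2).sum() / (n - 1)      # between videos
    msw = ((scores - row_means[:, None]) ** 2).sum() / (n * (k - 1))  # within videos
    return (msb - msw) / (msb + (k - 1) * msw)

def flag_inconsistent_raters(scores: np.ndarray) -> list[int]:
    """Flag raters whose removal raises alpha, i.e., who hurt group consistency."""
    base = cronbach_alpha(scores)
    return [j for j in range(scores.shape[1])
            if cronbach_alpha(np.delete(scores, j, axis=1)) > base]

# Toy audition: 9 training videos scored by 12 hypothetical applicants,
# one of whom scores at random (unrelated to video quality).
rng = np.random.default_rng(0)
quality = rng.normal(20.0, 4.0, size=9)                   # latent video quality
scores = quality[:, None] + rng.normal(0.0, 2.0, (9, 12))
scores[:, 5] = rng.normal(20.0, 4.0, size=9)              # the inconsistent rater
print(f"alpha={cronbach_alpha(scores):.3f}  ICC(1,1)={icc_single(scores):.3f}")
print("flagged raters:", flag_inconsistent_raters(scores))
```

As a side note on the cost figures, the reported $22.27/video against $1.03/minute implies an average graded video length of roughly 21.6 minutes.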

Original language: English (US)
Pages (from-to): 814-823
Number of pages: 10
Journal: Journal of Surgical Education
Volume: 76
Issue number: 3
DOIs
State: Published - May 1, 2019

Keywords

  • Biotissue
  • CS
  • Crowdsource
  • GJ
  • HJ
  • HPB
  • IHJ
  • OSATS
  • Objective Structured Assessment of Technical Skills
  • PJ
  • Practice-Based Learning and Improvement
  • Robotic surgery
  • Surgical education
  • critical section
  • gastrojejunostomy
  • hepaticojejunostomy
  • hepatobiliary
  • interrupted hepaticojejunostomy
  • pancreaticojejunostomy

ASJC Scopus subject areas

  • Surgery
  • Education
