Automatic and near real-time stylistic behavior assessment in robotic surgery

M. Ershad, Robert V Rege, Ann Majewicz Fey

Research output: Contribution to journal › Article

Abstract

Purpose: Automatic skill evaluation is of great importance in surgical robotic training. Extensive research has been done to evaluate surgical skill, and a variety of quantitative metrics have been proposed. However, these methods primarily use expert-selected features, which may not capture latent information in movement data. In addition, these features are calculated over the entire task and are provided to the user only after the task is complete. Thus, these quantitative metrics do not tell users how to modify their movements to improve performance in real time. This study focuses on automatic stylistic behavior recognition that has the potential to be implemented in near real time. Methods: We propose a sparse coding framework for automatic stylistic behavior recognition in short time intervals using only position data from the hand, wrist, elbow, and shoulder. A codebook is built for each stylistic adjective using the positive and negative labels provided for each trial through crowdsourcing. Sparse code coefficients are obtained for short time intervals (0.25 s) in a trial using this codebook. A support vector machine classifier is trained and validated through tenfold cross-validation using the sparse codes from the training set. Results: The results indicate that the proposed dictionary learning method can assess stylistic behavior in near real time from user joint position data, with improved accuracy compared to principal component analysis (PCA) features or raw data. Conclusion: The ability to automatically evaluate a trainee’s style of movement in short time intervals could provide the user with online, customized feedback and thus improve performance during surgical tasks.
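
As a rough illustration of the pipeline summarized above, the sketch below shows one way the three steps (codebook learning, sparse coding of 0.25 s windows, SVM classification with tenfold cross-validation) could be wired together. It is a minimal, hypothetical example, not the authors' implementation: the window dimensions, codebook size, sparsity level, and the use of scikit-learn's DictionaryLearning, SparseCoder, and SVC are assumptions, and the paper's per-adjective, label-driven codebook construction is approximated here by an unsupervised dictionary fit.

```python
# Minimal sketch (assumed parameters, placeholder data): learn a codebook,
# sparse-code 0.25 s windows of joint positions, classify with an SVM.
import numpy as np
from sklearn.decomposition import DictionaryLearning, SparseCoder
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder data: each row is one flattened 0.25 s window of hand, wrist,
# elbow, and shoulder positions; labels stand in for the crowd-sourced
# positive/negative ratings of one stylistic adjective.
n_windows, window_dim = 400, 60                  # assumed sizes
X = rng.standard_normal((n_windows, window_dim))
y = rng.integers(0, 2, n_windows)                # 1 = adjective present

# 1) Learn a codebook (dictionary) from the windows.
dico = DictionaryLearning(n_components=32, transform_algorithm="omp",
                          transform_n_nonzero_coefs=5, random_state=0)
dico.fit(X)

# 2) Sparse-code every window against the learned codebook.
coder = SparseCoder(dictionary=dico.components_,
                    transform_algorithm="omp", transform_n_nonzero_coefs=5)
codes = coder.transform(X)

# 3) Train and evaluate a support vector machine on the sparse codes
#    with tenfold cross-validation.
clf = SVC(kernel="linear")
scores = cross_val_score(clf, codes, y, cv=10)
print(f"10-fold CV accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```

In a faithful evaluation the dictionary would be learned only from the training folds of each split; the sketch fits it once on all windows for brevity.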

Fingerprint

Robotics
Glossaries
Support vector machines
Labels
Classifiers
Feedback
Crowdsourcing
Principal component analysis
Elbow
Robotic surgery
Wrist
Hand
Joints
Learning
Research

Keywords

  • Crowdsourcing
  • Robotic surgery
  • Surgical skill assessment

ASJC Scopus subject areas

  • Surgery
  • Biomedical Engineering
  • Radiology, Nuclear Medicine and Imaging
  • Computer Vision and Pattern Recognition
  • Health Informatics
  • Computer Science Applications
  • Computer Graphics and Computer-Aided Design

Cite this

@article{1b755f444df94edd8f90e238f326a015,
title = "Automatic and near real-time stylistic behavior assessment in robotic surgery",
abstract = "Purpose: Automatic skill evaluation is of great importance in surgical robotic training. Extensive research has been done to evaluate surgical skill, and a variety of quantitative metrics have been proposed. However, these methods primarily use expert selected features which may not capture latent information in movement data. In addition, these features are calculated over the entire task time and are provided to the user after the completion of the task. Thus, these quantitative metrics do not provide users with information on how to modify their movements to improve performance in real time. This study focuses on automatic stylistic behavior recognition that has the potential to be implemented in near real time. Methods: We propose a sparse coding framework for automatic stylistic behavior recognition in short time intervals using only position data from the hands, wrist, elbow, and shoulder. A codebook is built for each stylistic adjective using the positive and negative labels provided for each trial through crowd sourcing. Sparse code coefficients are obtained for short time intervals (0.25 s) in a trial using this codebook. A support vector machine classifier is trained and validated through tenfold cross-validation using the sparse codes from the training set. Results: The results indicate that the proposed dictionary learning method is able to assess stylistic behavior performance in near real time using user joint position data with improved accuracy compared to using PCA features or raw data. Conclusion: The possibility to automatically evaluate a trainee’s style of movement in short time intervals could provide the user with online customized feedback and thus improve performance during surgical tasks.",
keywords = "Crowdsourcing, Robotic surgery, Surgical skill assessment",
author = "M. Ershad and Rege, {Robert V} and {Majewicz Fey}, Ann",
year = "2019",
month = "1",
day = "1",
doi = "10.1007/s11548-019-01920-6",
language = "English (US)",
journal = "Computer-Assisted Radiology and Surgery",
issn = "1861-6410",
publisher = "Springer Verlag",

}

TY - JOUR

T1 - Automatic and near real-time stylistic behavior assessment in robotic surgery

AU - Ershad, M.

AU - Rege, Robert V

AU - Majewicz Fey, Ann

PY - 2019/1/1

Y1 - 2019/1/1

AB - Purpose: Automatic skill evaluation is of great importance in surgical robotic training. Extensive research has been done to evaluate surgical skill, and a variety of quantitative metrics have been proposed. However, these methods primarily use expert selected features which may not capture latent information in movement data. In addition, these features are calculated over the entire task time and are provided to the user after the completion of the task. Thus, these quantitative metrics do not provide users with information on how to modify their movements to improve performance in real time. This study focuses on automatic stylistic behavior recognition that has the potential to be implemented in near real time. Methods: We propose a sparse coding framework for automatic stylistic behavior recognition in short time intervals using only position data from the hands, wrist, elbow, and shoulder. A codebook is built for each stylistic adjective using the positive and negative labels provided for each trial through crowd sourcing. Sparse code coefficients are obtained for short time intervals (0.25 s) in a trial using this codebook. A support vector machine classifier is trained and validated through tenfold cross-validation using the sparse codes from the training set. Results: The results indicate that the proposed dictionary learning method is able to assess stylistic behavior performance in near real time using user joint position data with improved accuracy compared to using PCA features or raw data. Conclusion: The possibility to automatically evaluate a trainee’s style of movement in short time intervals could provide the user with online customized feedback and thus improve performance during surgical tasks.

KW - Crowdsourcing

KW - Robotic surgery

KW - Surgical skill assessment

UR - http://www.scopus.com/inward/record.url?scp=85061696201&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85061696201&partnerID=8YFLogxK

U2 - 10.1007/s11548-019-01920-6

DO - 10.1007/s11548-019-01920-6

M3 - Article

C2 - 30779023

AN - SCOPUS:85061696201

JO - International Journal of Computer Assisted Radiology and Surgery

JF - International Journal of Computer Assisted Radiology and Surgery

SN - 1861-6410

ER -