An Automatic Deep Learning–Based Workflow for Glioblastoma Survival Prediction Using Preoperative Multimodal MR Images: A Feasibility Study

Jie Fu, Kamal Singhrao, Xinran Zhong, Yu Gao, Sharon X. Qi, Yingli Yang, Dan Ruan, John H. Lewis

Research output: Contribution to journal › Article › peer-review

2 Scopus citations

Abstract

Purpose: Most radiomic studies use the features extracted from manually drawn tumor contours for classification or survival prediction. However, large interobserver segmentation variations lead to inconsistent features and hence introduce more challenges in constructing robust prediction models. Here, we proposed an automatic workflow for glioblastoma (GBM) survival prediction based on multimodal magnetic resonance (MR) images.

Methods and Materials: Two hundred eighty-five patients with glioma (210 GBM, 75 low-grade glioma) were included. One hundred sixty-three of the patients with GBM had overall survival data. Every patient had 4 preoperative MR images and manually drawn tumor contours. A 3-dimensional convolutional neural network, VGG-Seg, was trained and validated using 122 patients with glioma for automatic GBM segmentation. The trained VGG-Seg was applied to the remaining 163 patients with GBM to generate their autosegmented tumor contours. The handcrafted and deep learning (DL)–based radiomic features were extracted from the autosegmented contours using explicitly designed algorithms and a pretrained convolutional neural network, respectively. The 163 patients with GBM were randomly split into training (n = 122) and testing (n = 41) sets for survival analysis. Cox regression models were trained to construct the handcrafted and DL-based signatures. The prognostic powers of the 2 signatures were evaluated and compared.

Results: The VGG-Seg achieved a mean Dice coefficient of 0.86 for GBM segmentation across the 163 patients with GBM. The handcrafted signature achieved a C-index of 0.64 (95% confidence interval, 0.55-0.73), whereas the DL-based signature achieved a C-index of 0.67 (95% confidence interval, 0.57-0.77). Unlike the handcrafted signature, the DL-based signature successfully stratified testing patients into 2 prognostically distinct groups.

Conclusions: The VGG-Seg generated accurate GBM contours from 4 MR images. The DL-based signature achieved a numerically higher C-index than the handcrafted signature and significant patient stratification. The proposed automatic workflow demonstrated the potential of improving patient stratification and survival prediction in patients with GBM.
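The C-index reported above (Harrell's concordance index) measures how often a signature ranks patient pairs in the same order as their observed survival. A minimal sketch of this metric is shown below; it is not the authors' implementation, and the toy survival times, event indicators, and risk scores are purely illustrative.

```python
def concordance_index(times, events, risk_scores):
    """Harrell's C-index: the fraction of comparable patient pairs whose
    predicted risk ordering agrees with the observed survival ordering.
    A pair (i, j) is comparable when patient i had an observed event
    strictly before patient j's (possibly censored) follow-up time."""
    concordant, ties, comparable = 0, 0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if events[i] and times[i] < times[j]:
                comparable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1        # higher risk died earlier: correct
                elif risk_scores[i] == risk_scores[j]:
                    ties += 1              # tied predictions count as half
    return (concordant + 0.5 * ties) / comparable

# Toy example: survival times in months, event indicator (1 = death,
# 0 = censored), and a hypothetical risk signature (higher = worse).
times = [5, 10, 12, 20]
events = [1, 1, 0, 1]
risk = [0.9, 0.2, 0.7, 0.1]
print(concordance_index(times, events, risk))  # → 0.8
```

A C-index of 0.5 corresponds to random ranking and 1.0 to perfect ranking, which puts the reported values of 0.64 and 0.67 in context as modest but better-than-chance prognostic signals.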

Original language: English (US)
Article number: 100746
Journal: Advances in Radiation Oncology
Volume: 6
Issue number: 5
DOIs
State: Published - Sep 2021

ASJC Scopus subject areas

  • Oncology
  • Radiology, Nuclear Medicine and Imaging

