Title: Deep Convolutional Neural Networks Detect Tumor Genotype from Pathological Tissue Images in Gastrointestinal Stromal Tumors
Authors: 羅崇銘
Lo, Chung-Ming
Liang, Cher-Wei
Fang, Pei-Wei
Huang, Hsuan-Ying
Contributors: Graduate Institute of Library, Information and Archival Studies
Keywords: KIT; PDGFRA; deep convolutional neural network; gastrointestinal stromal tumor; machine learning
Date: 2021-11
Issue Date: 2022-04-14 15:26:22 (UTC+8)
Abstract: Gastrointestinal stromal tumors (GIST) are common mesenchymal tumors, and their effective treatment depends on the mutational subtype of the KIT/PDGFRA genes. We established deep convolutional neural network (DCNN) models to rapidly predict drug-sensitive mutation subtypes from images of pathological tissue. A total of 5153 pathological images of 365 different GISTs from three different laboratories were collected and divided into training and validation sets. A transfer learning mechanism based on DCNN was used with four different network architectures to identify cases with drug-sensitive mutations. Accuracy ranged from 75% to 87% across architectures; cross-institutional inconsistency, however, was observed. Using gray-scale images resulted in a 7% drop in accuracy (accuracy 80%, sensitivity 87%, specificity 73%). Using images containing only nuclei (accuracy 81%, sensitivity 87%, specificity 73%) or only cytoplasm (accuracy 79%, sensitivity 88%, specificity 67%) reduced accuracy by 6% and 8%, respectively, suggesting buffering effects across subcellular components in DCNN interpretation. The proposed DCNN model successfully inferred cases with drug-sensitive mutations with high accuracy, and the contributions of image color and subcellular components were also revealed. These results will help to generate a cheaper and quicker screening method for tumor gene testing.
Relation: Cancers, 13, 5787
Data Type: article
DOI:
Appears in Collections: [Graduate Institute of Library, Information and Archival Studies] Journal Articles

Files in This Item:

File: 9.pdf (796 KB, Adobe PDF)

All items in the Academic Hub (學術集成) are protected by copyright, with all rights reserved.
