TY - JOUR
T1 - Automatic Colorectal Cancer Screening Using Deep Learning in Spatial Light Interference Microscopy Data
AU - Zhang, Jingfang K.
AU - Fanous, Michael
AU - Sobh, Nahil
AU - Kajdacsy-Balla, Andre
AU - Popescu, Gabriel
N1 - Funding Information:
This research was funded by the National Institutes of Health (R01CA238191, R01GM129709).
Publisher Copyright:
© 2022 by the authors. Licensee MDPI, Basel, Switzerland.
PY - 2022/2/17
Y1 - 2022/2/17
N2 - The surgical pathology workflow currently adopted by clinics uses staining to reveal tissue architecture within thin sections. A trained pathologist then conducts a visual examination of these slices and, since the investigation is based on an empirical assessment, a certain amount of subjectivity is unavoidable. Furthermore, the reliance on external contrast agents such as hematoxylin and eosin (H&E), albeit well established, makes it difficult to standardize color balance, staining strength, and imaging conditions, hindering automated computational analysis. In response to these challenges, we applied spatial light interference microscopy (SLIM), a label-free method that generates contrast based on intrinsic tissue refractive index signatures. Thus, we reduce human bias and make imaging data comparable across instruments and clinics. We applied a Mask R-CNN deep learning algorithm to the SLIM data to achieve an automated colorectal cancer screening procedure, i.e., classifying normal vs. cancerous specimens. On a tissue microarray consisting of specimens from 132 patients, our method achieved 91% accuracy for gland detection, 99.71% accuracy in gland-level classification, and 97% accuracy in core-level classification. A SLIM tissue scanner accompanied by an application-specific deep learning algorithm may become a valuable clinical tool, enabling faster and more accurate assessments by pathologists.
AB - The surgical pathology workflow currently adopted by clinics uses staining to reveal tissue architecture within thin sections. A trained pathologist then conducts a visual examination of these slices and, since the investigation is based on an empirical assessment, a certain amount of subjectivity is unavoidable. Furthermore, the reliance on external contrast agents such as hematoxylin and eosin (H&E), albeit well established, makes it difficult to standardize color balance, staining strength, and imaging conditions, hindering automated computational analysis. In response to these challenges, we applied spatial light interference microscopy (SLIM), a label-free method that generates contrast based on intrinsic tissue refractive index signatures. Thus, we reduce human bias and make imaging data comparable across instruments and clinics. We applied a Mask R-CNN deep learning algorithm to the SLIM data to achieve an automated colorectal cancer screening procedure, i.e., classifying normal vs. cancerous specimens. On a tissue microarray consisting of specimens from 132 patients, our method achieved 91% accuracy for gland detection, 99.71% accuracy in gland-level classification, and 97% accuracy in core-level classification. A SLIM tissue scanner accompanied by an application-specific deep learning algorithm may become a valuable clinical tool, enabling faster and more accurate assessments by pathologists.
KW - spatial light interference microscopy
KW - label-free
KW - mask R-CNN
KW - deep learning
KW - automated colorectal cancer screening
UR - http://www.scopus.com/inward/record.url?scp=85124981851&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85124981851&partnerID=8YFLogxK
U2 - 10.3390/cells11040716
DO - 10.3390/cells11040716
M3 - Article
C2 - 35203365
SN - 2073-4409
VL - 11
JO - Cells
JF - Cells
IS - 4
M1 - 716
ER -