TY - JOUR
T1 - Deep learning-level melanoma detection by interpretable machine learning and imaging biomarker cues
AU - Gareau, Daniel S.
AU - Browning, James
AU - Correa Da Rosa, Joel
AU - Suarez-Farinas, Mayte
AU - Lish, Samantha
AU - Zong, Amanda M.
AU - Firester, Benjamin
AU - Vrattos, Charles
AU - Renert-Yuval, Yael
AU - Gamboa, Mauricio
AU - Vallone, María G.
AU - Barragán-Estudillo, Zamira F.
AU - Tamez-Peña, Alejandra L.
AU - Montoya, Javier
AU - Jesús-Silva, Miriam A.
AU - Carrera, Cristina
AU - Malvehy, Josep
AU - Puig, Susana
AU - Marghoob, Ashfaq
AU - Carucci, John A.
AU - Krueger, James G.
N1 - Publisher Copyright:
© The Author(s) 2020.
PY - 2020/11/1
Y1 - 2020/11/1
AB - Significance: Melanoma is a deadly cancer that physicians struggle to diagnose early because they lack the knowledge to differentiate benign from malignant lesions. Deep machine learning approaches to image analysis offer promise but lack the transparency to be widely adopted as stand-alone diagnostics. Aim: We aimed to create a transparent machine learning technology (i.e., not deep learning) to discriminate melanomas from nevi in dermoscopy images and an interface for sensory cue integration. Approach: Imaging biomarker cues (IBCs) fed ensemble machine learning classifier (Eclass) training, while raw images fed deep learning classifier training. We compared the areas under the diagnostic receiver operator curves. Results: Our interpretable machine learning algorithm outperformed the leading deep-learning approach 75% of the time. The user interface displayed only the diagnostic imaging biomarkers as IBCs. Conclusions: From a translational perspective, Eclass is better than convolutional machine learning diagnosis in that physicians can embrace it faster than black-box outputs. Imaging biomarker cues may be used during sensory cue integration in clinical screening. Our method may be applied to other image-based diagnostic analyses, including pathology and radiology.
KW - diagnostic application
KW - imaging biomarkers
KW - machine learning
KW - sensory cue integration
KW - skin cancer classification
UR - http://www.scopus.com/inward/record.url?scp=85096948055&partnerID=8YFLogxK
U2 - 10.1117/1.JBO.25.11.112906
DO - 10.1117/1.JBO.25.11.112906
M3 - Article
C2 - 33247560
AN - SCOPUS:85096948055
SN - 1083-3668
VL - 25
JO - Journal of Biomedical Optics
JF - Journal of Biomedical Optics
IS - 11
M1 - 112906
ER -