Advanced CNN models in gastric cancer diagnosis: enhancing endoscopic image analysis with deep transfer learning

Front Oncol. 2024 Sep 16:14:1431912. doi: 10.3389/fonc.2024.1431912. eCollection 2024.

Abstract

Introduction: The rapid advancement of science and technology has significantly expanded the capabilities of artificial intelligence, enhancing diagnostic accuracy for gastric cancer.

Methods: This research aims to identify various gastric disorders from endoscopic images using advanced Convolutional Neural Network (CNN) models. The Kvasir dataset, comprising images of the normal Z-line, normal pylorus, ulcerative colitis, stool, and polyps, was used. Images were pre-processed and graphically analyzed to understand pixel intensity patterns, followed by feature extraction using adaptive thresholding and contour analysis to obtain morphological values. Five deep transfer learning models (NASNetMobile, EfficientNetB5, EfficientNetB6, InceptionV3, and DenseNet169) and a hybrid model combining EfficientNetB6 and DenseNet169 were evaluated using various performance metrics.
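The adaptive-thresholding step of the feature-extraction pipeline can be sketched in pure NumPy as follows. This is a minimal illustration, not the authors' code; the block size of 11 and offset constant C=2 are assumed defaults (mirroring the parameters of OpenCV's `cv2.adaptiveThreshold` with mean thresholding), and contour analysis of the resulting binary mask would follow as a separate step.

```python
import numpy as np

def adaptive_threshold(gray, block=11, c=2):
    """Adaptive mean thresholding: each pixel is compared against the
    mean of its block x block neighbourhood minus a constant c.
    Returns a binary image (0 or 255) suitable for contour analysis."""
    pad = block // 2
    padded = np.pad(gray.astype(float), pad, mode="edge")
    # Integral image so every box sum costs O(1)
    ii = padded.cumsum(axis=0).cumsum(axis=1)
    ii = np.pad(ii, ((1, 0), (1, 0)))
    h, w = gray.shape
    # Sum over the block x block window centred on every pixel
    box = (ii[block:block + h, block:block + w]
           - ii[:h, block:block + w]
           - ii[block:block + h, :w]
           + ii[:h, :w])
    local_mean = box / (block * block)
    return (gray > local_mean - c).astype(np.uint8) * 255

# Synthetic grayscale "image": a bright square on a dark background
img = np.zeros((64, 64), dtype=np.uint8)
img[16:48, 16:48] = 255
binary = adaptive_threshold(img)
```

Flat regions (all-dark or all-bright) binarize to foreground because each pixel slightly exceeds its own local mean minus C, while pixels just outside a bright structure fall below the locally elevated mean and are suppressed, so the mask traces structure boundaries for the subsequent contour step.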

Results & discussion: On the complete gastric cancer image set, EfficientNetB6 achieved the best performance, with 99.88% accuracy and a loss of 0.049. Additionally, InceptionV3 achieved the highest testing accuracy of 97.94% for detecting the normal pylorus, while EfficientNetB6 excelled in detecting ulcerative colitis and the normal Z-line with accuracies of 98.8% and 97.85%, respectively. EfficientNetB5 performed best for polyps and stool with accuracies of 98.40% and 96.86%, respectively. The study demonstrates that deep transfer learning techniques can effectively predict and classify different types of gastric cancer at early stages, aiding experts in diagnosis and detection.

Keywords: contour features; deep learning; gastric cancer; medical images; transfer learning; ulcerative colitis.

Grants and funding

The author(s) declare that financial support was received for the research, authorship, and/or publication of this article. This work was supported by a National Research Foundation of Korea (NRF) grant funded by the Korean government (MSIT) (NRF-2023R1A2C1005950).