Minireviews Open Access
Copyright ©The Author(s) 2020. Published by Baishideng Publishing Group Inc. All rights reserved.
World J Gastroenterol. Sep 28, 2020; 26(36): 5408-5419
Published online Sep 28, 2020. doi: 10.3748/wjg.v26.i36.5408
Artificial intelligence in gastric cancer: Application and future perspectives
Peng-Hui Niu, Lu-Lu Zhao, Dong-Bing Zhao, Ying-Tai Chen, Department of Pancreatic and Gastric Surgery, National Cancer Center/National Clinical Research Center for Cancer/Cancer Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College, Beijing 100021, China
Hong-Liang Wu, Department of Anesthesiology, National Cancer Center/National Clinical Research Center for Cancer/Cancer Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College, Beijing 100021, China
ORCID number: Peng-Hui Niu (0000-0003-0114-1625); Lu-Lu Zhao (0000-0001-8344-0498); Hong-Liang Wu (0000-0002-3757-8610); Dong-Bing Zhao (0000-0002-3011-5277); Ying-Tai Chen (0000-0003-4980-6315).
Author contributions: Niu PH, Zhao LL, and Wu HL contributed equally to this work; All authors made substantial contributions to the intellectual content of this paper.
Supported by National Key R&D Program of China, No. 2017YFC0908300.
Conflict-of-interest statement: All the authors have no conflict of interest related to the manuscript.
Open-Access: This article is an open-access article that was selected by an in-house editor and fully peer-reviewed by external reviewers. It is distributed in accordance with the Creative Commons Attribution NonCommercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/
Corresponding author: Ying-Tai Chen, MD, Professor, National Cancer Center/National Clinical Research Center for Cancer/Cancer Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College, No.17 Panjiayuan Nanli, Beijing 100021, China. yingtaichen@126.com
Received: May 24, 2020
Peer-review started: May 24, 2020
First decision: July 29, 2020
Revised: August 2, 2020
Accepted: August 29, 2020
Article in press: August 29, 2020
Published online: September 28, 2020

Abstract

Gastric cancer is the fourth leading cause of cancer-related mortality worldwide, with a 5-year survival rate of less than 40%. In recent years, several applications of artificial intelligence (AI) have emerged in the gastric cancer field, driven by its efficient computational power and learning capacity, including image-based diagnosis and prognosis prediction. AI-assisted diagnosis covers pathology, endoscopy, and computed tomography, while prognosis research focuses on the prediction of recurrence, metastasis, and survival. In this review, a comprehensive literature search was performed for articles published up to April 2020 in the PubMed, Embase, Web of Science, and Cochrane Library databases, and the current status of AI applications in gastric cancer was systematically summarized. Moreover, future directions in this field were analyzed to overcome the risk of overfitting AI models and to enhance their accuracy and applicability in clinical practice.

Key Words: Gastric cancer, Image-based diagnosis, Prognosis prediction, Artificial intelligence, Machine learning, Deep learning

Core Tip: Several applications of artificial intelligence (AI) have recently emerged in the gastric cancer field, driven by its efficient computational power and learning capacity, including image-based diagnosis and prognosis prediction. In this review, we searched the relevant works published up to April 2020 in the PubMed, Embase, Web of Science, and Cochrane Library databases and comprehensively summarize the current status of AI applications in gastric cancer. In addition, challenges and future directions in this field are discussed with the aim of improving the accuracy and applicability of AI models in clinical practice.



INTRODUCTION

Gastric cancer has long been recognized as an aggressive malignancy with a 5-year survival rate of less than 40%[1]. Despite decreases in incidence and mortality in some countries over the past few decades, gastric cancer is still the sixth most common malignancy and remains the fourth leading cause of cancer-related death worldwide[2-4]. Because early gastric cancer often presents with atypical symptoms and advanced disease behaves aggressively, reducing recurrence and prolonging survival depend increasingly on advances in screening, diagnosis, treatment, prognosis prediction, and other new technologies. Artificial intelligence (AI), with its efficient computational power and learning capacity, has attracted considerable attention in the field of gastric cancer.

In contrast to the natural intelligence of humans, AI is intelligence displayed by machines; the concept first emerged in 1956. The term “artificial intelligence” is commonly used to describe machines (or computers) that imitate human “cognitive” functions such as learning and problem solving[5]. As a subset of AI, machine learning (ML) can be defined as the study of computer algorithms that improve automatically through experience[6]. Based on training data, ML algorithms build models that can make predictions or decisions without being explicitly programmed. In the last few years, ML algorithms, including random forests and support vector machines (SVM), have been applied in various domains, especially in medicine. As of 2020, deep learning (DL) had become the primary approach in much ongoing ML work. DL is a class of ML algorithms that uses multiple layers to progressively extract higher-level features from the raw input. In short, ML is a major branch of AI, and DL is one way of implementing ML. Recent advances in hardware and computational power have led to the emergence of several AI models in the field of gastric cancer[7-43]. AI-assisted diagnosis mainly involves pathology, endoscopy, and computed tomography (CT)[7-35], while prognosis research focuses on the prediction of recurrence, metastasis, and survival[36-43].
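
To make the distinction concrete, the snippet below is a minimal, illustrative sketch (assuming Python with scikit-learn) of the classical ML workflow described above: labeled examples in, a trained model out, with performance reported as the area under the receiver operating characteristic curve (AUC). The features and labels are synthetic stand-ins, not clinical data.

```python
# Illustrative only: synthetic "cases" with 8 numeric features and a binary label.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))                       # 500 hypothetical cases
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for model in (RandomForestClassifier(n_estimators=200, random_state=0),
              SVC(probability=True, random_state=0)):
    model.fit(X_train, y_train)                     # learn from labeled examples
    prob = model.predict_proba(X_test)[:, 1]        # predicted probability of class 1
    print(type(model).__name__, "AUC:", round(roc_auc_score(y_test, prob), 3))
```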

In this review, we searched the relevant works published up to April 2020 in the PubMed, Embase, Web of Science, and Cochrane Library databases and comprehensively summarize the current status of AI applications in gastric cancer. In addition, challenges and future directions in this field are discussed with the aim of improving the accuracy and applicability of AI models in clinical practice.

AI IN THE DIAGNOSIS OF GASTRIC CANCER

Gastric cancer is mostly diagnosed at advanced stages because of its latent and nonspecific symptoms, which leads to poor prognosis. It has been reported that accurate early detection of gastric cancer can raise the 5-year survival rate to approximately 90%[44,45]. However, early diagnosis of gastric cancer is limited mainly by the number of experienced imaging experts. Furthermore, diagnostic accuracy depends largely on the clinical experience of these experts and is vulnerable to multiple factors; even qualified experts cannot avoid all misdiagnoses and missed diagnoses. AI methods, which imitate human cognitive functions via computers, are adept at processing and analyzing large amounts of data and can therefore assist gastroenterologists in clinical diagnosis and decision making. To date, AI has been applied in many medical imaging fields, such as endoscopy, pathology, and CT imaging. AI-assisted endoscopic diagnosis includes the extraction of image features[7,8], the detection of early gastric cancer[9-14], the detection of precancerous conditions[15], the optimization of magnifying endoscopy with narrow-band imaging (M-NBI)[16-19], and the application of Raman endoscopy[20,21]. AI-assisted pathologic diagnosis involves the automatic identification of gastric cancer[22], the detection of gastric cancer based on whole slide imaging (WSI)[23-26], the automatic detection of tumor-infiltrating lymphocytes (TILs)[27], and the segmentation of lesion regions[28-31], while AI-assisted CT diagnosis focuses on the identification of preoperative peritoneal metastasis[32], the detection of perigastric metastatic lymph nodes[33], and two other new imaging techniques[34,35]. Under certain conditions, the diagnostic performance of these AI models is not inferior to that of human experts.

AI-assisted diagnosis in endoscopy

Endoscopy plays an essential role in the detection of gastric cancer because it enables endoscopists to observe cancerous sites directly. Accurate diagnosis of early gastric cancer from endoscopic images is urgently needed to improve the otherwise poor prognosis. However, recent studies have shown that the detection accuracy of conventional endoscopy ranges only from 69% to 79%[46]. Given the heavy workload of medical image analysis, even experienced endoscopists inevitably make misdiagnoses and miss lesions. Recent efforts in endoscopy have therefore focused on adopting AI techniques to enhance the inspection and diagnosis of gastric cancer (summarized in Table 1).

Table 1 Applications of artificial intelligence in endoscopy based on different study populations.
Ref. | Year | Country/region | Number of cases | Study population | Methods | Results
Liu et al[7] | 2016 | China | 400 images | Hospital | JDPCA | AUC (0.9532), accuracy (90.75%)
Ali et al[8] | 2018 | Pakistan | 176 images | Public images dataset | G2LCM | AUC (0.91), accuracy (87%)
Luo et al[9] | 2019 | China | 1036496 images | Hospital | GRAIDS | Accuracy (up to 97.7%)
Sakai et al[10] | 2018 | Japan | 29037 images | Hospital | CNN | Accuracy (87.6%)
Yoon et al[11] | 2019 | South Korea | 11539 images | Hospital | VGG model | AUC (0.981 for detection), AUC (0.851 for depth prediction)
Nakahira et al[12] | 2019 | Japan | 107284 images | Cancer Institute | Deep neural network | Kappa value (0.27)
Zhu et al[13] | 2019 | China | 993 images | Hospital | CNN-CAD system | AUC (0.94), accuracy (89.16%)
Wang et al[14] | 2019 | China | 104864 images | Hospital | MCNN | Sensitivity (79.622%), specificity (78.48%)
Guimarães et al[15] | 2020 | Germany | 270 images | Medical center | DL | AUC (0.98), accuracy (93%)
Miyaki et al[16] | 2015 | Japan | 100 cases | Hospital | SVM | Average output value (0.846 ± 0.220)
Liu et al[17] | 2018 | China | 1120 M-NBI images/3068 images | Hospital | Deep CNN | Top accuracy (98.5%)
Horiuchi et al[18] | 2019 | Japan | 2828 images | Hospital | CNN | Accuracy (85.3%)
Li et al[19] | 2019 | China | 2088 images | Hospital | CNN | Accuracy (90.91%)
Bergholt et al[20] | 2011 | Singapore | 1063 in vivo Raman spectra | Hospital | ACO-LDA algorithms | Sensitivity (94.6%), specificity (94.6%)
Duraipandian et al[21] | 2012 | Singapore | 2748 in vivo Raman spectra | Hospital | PLS-DA algorithms | Accuracy (85.6%), specificity (86.2%)

The key to high detection accuracy lies in extracting discriminative features that can clearly distinguish lesion images from normal images. Liu et al[7] designed an ML algorithm called joint diagonalization principal component analysis (JDPCA) for dimension reduction of endoscopic images. A novel AI-assisted method was then presented to detect images of early gastric cancer by combining JDPCA with conventional, non-learning algorithms, which performed better than traditional related methods. Ali et al[8] presented a novel texture extraction method, the Gabor-based gray level co-occurrence matrix (GLCM), to detect abnormal frames in whole chromoendoscopy sequences. The authors then combined an SVM classifier with the Gabor-based GLCM texture features to screen for early gastric cancer. The detection accuracy, specificity, sensitivity, and area under the curve (AUC) were 87%, 82%, 91%, and 0.91, respectively, higher than the results obtained by the SVM classifier combined with other texture extraction methods.
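
As an illustration of a texture-plus-SVM pipeline of this kind (not the authors' exact G2LCM implementation), the hedged sketch below computes plain GLCM descriptors with scikit-image and feeds them to an SVM; the patches and labels are random placeholders.

```python
# Illustrative only: GLCM texture descriptors + SVM on random stand-in patches.
import numpy as np
from skimage.feature import graycomatrix, graycoprops   # scikit-image >= 0.19
from sklearn.svm import SVC

def glcm_features(gray_patch):
    """Small GLCM descriptor (contrast, homogeneity, energy, correlation)."""
    glcm = graycomatrix(gray_patch, distances=[1, 2], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

rng = np.random.default_rng(0)
patches = rng.integers(0, 256, size=(40, 64, 64), dtype=np.uint8)  # toy patches
labels = rng.integers(0, 2, size=40)                               # 0 normal, 1 lesion

X = np.array([glcm_features(p) for p in patches])
clf = SVC(kernel="rbf").fit(X, labels)
print("predicted labels for first 5 patches:", clf.predict(X[:5]))
```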

Luo et al[9] constructed the Gastrointestinal Artificial Intelligence Diagnostic System (GRAIDS) to automatically detect upper gastrointestinal cancer in real time. They used 1036496 standard white-light endoscopy images from 84424 cases across China for training and testing. In different large-scale validation and prospective sets, diagnostic accuracy was satisfactory, ranging from 0.915 to 0.977. Moreover, the experimental results demonstrated that GRAIDS attained sensitivity comparable to that of human experts (0.942 vs 0.945). Sakai et al[10] introduced a convolutional neural network (CNN)-based automatic detection model with high accuracy and argued that the model could enhance the diagnostic capabilities of endoscopists. Yoon et al[11] adopted a lesion-based model for accurate detection and depth prediction of early gastric cancer and evaluated the factors significantly associated with AI-assisted diagnosis.

Nakahira et al[12] reported that an AI-assisted endoscopic image analysis system could effectively stratify the risk of gastric cancer and further evaluated the agreement of the AI model with the consensus diagnoses of three endoscopists. Zhu et al[13] also constructed a CNN computer-aided detection system to determine the invasion depth of early gastric cancer. Their proposed model achieved substantially higher specificity and accuracy than endoscopists, with an AUC of 0.94. Another study proposed a multicolumn CNN (MCNN) to improve gastric cancer screening[14]. The novelty of their method lay in combining electronic gastroscopy with cloud-based endoscopic image analysis tools. Experimental results revealed that the proposed MCNN dramatically outperformed other CNN models (including AlexNet[47], GoogLeNet[48], and VGGNet[49]) and non-DL methods (including kNN[50] and SVM-based[51] classifiers).

Chronic atrophic gastritis is a common precancerous gastric condition that may lead to the development of gastric cancer[52]. Conventional endoscopy shows high interobserver variability among endoscopists in distinguishing precancerous conditions. Therefore, Guimarães et al[15] developed and trained a DL approach on 200 real-world endoscopic images to diagnose atrophic gastritis. The DL model achieved a diagnostic accuracy of 93% and an AUC of 0.98, outperforming the combined results of expert endoscopists.

Given the rapid advances in narrow-band imaging, M-NBI has received great attention for the diagnosis of early gastric cancer, as it offers markedly better accuracy than conventional white-light imaging[53]. However, interobserver variability limits lesion diagnosis with M-NBI, and it is difficult for endoscopists to master the technique in a short time[54]. Because advances in AI may offer a solution, several studies on AI-assisted image diagnosis have emerged in recent years. Miyaki et al[16] developed an SVM system to quantitatively identify gastric cancer based on magnifying endoscopy with blue-laser imaging. Liu et al[17] first applied transfer learning, fine-tuning deep CNN features, to classify gastric mucosal lesions in M-NBI images. Horiuchi et al[18] and Li et al[19] adopted CNN systems to distinguish early gastric cancer from noncancerous lesions efficiently and obtained excellent diagnostic performance. Their results suggested that the diagnostic sensitivity of the CNN models with M-NBI was superior to that of endoscopists.
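
The transfer-learning recipe used in several of these studies can be sketched as follows, assuming PyTorch/torchvision (version 0.13 or later for the weights API): an ImageNet-pretrained CNN backbone is frozen and only a new two-class head is fine-tuned. This is an illustration of the general approach, not any author's implementation.

```python
# Illustrative only: freeze a pretrained backbone, fine-tune a new 2-class head.
import torch
import torch.nn as nn
from torchvision import models

model = models.vgg16(weights=models.VGG16_Weights.DEFAULT)   # ImageNet weights
for param in model.features.parameters():
    param.requires_grad = False                               # freeze conv layers
model.classifier[6] = nn.Linear(model.classifier[6].in_features, 2)  # new head

optimizer = torch.optim.Adam(model.classifier[6].parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

# One dummy training step; a real pipeline would loop over labeled M-NBI images.
images = torch.randn(4, 3, 224, 224)
labels = torch.tensor([0, 1, 1, 0])
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
print("training loss:", round(loss.item(), 4))
```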

Furthermore, a previous report suggested that M-NBI remains inadequate for effectively and accurately diagnosing grossly invisible lesions because it lacks biochemical information[55]. Raman spectroscopy, a point-wise spectroscopic technique, can comprehensively probe the surface and subsurface cellular structures of tissue, and Raman endoscopy therefore holds promise for the diagnosis of early gastric cancer. Bergholt et al[20] first combined real-time Raman endoscopy with AI-based algorithms to distinguish neoplastic from normal gastric tissue. Later, another study devised an automated Raman spectroscopy diagnostic framework based on partial least squares discriminant analysis (PLS-DA) that detected gastric cancer with a diagnostic accuracy of 85.6%[21].
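
A minimal PLS-DA sketch (assuming scikit-learn) is shown below: partial least squares regression on binary class labels, thresholded to yield a classification. The spectra are random stand-ins for in vivo Raman measurements.

```python
# Illustrative only: PLS regression on binary labels, thresholded (PLS-DA).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
spectra = rng.normal(size=(200, 1024))     # 200 spectra x 1024 wavenumber bins
labels = rng.integers(0, 2, size=200)      # 0 = normal, 1 = neoplastic

X_tr, X_te, y_tr, y_te = train_test_split(spectra, labels, random_state=0)
pls = PLSRegression(n_components=8).fit(X_tr, y_tr)   # regress label on spectrum
pred = (pls.predict(X_te).ravel() >= 0.5).astype(int) # threshold the PLS score
print("toy accuracy:", (pred == y_te).mean())
```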

AI-assisted diagnosis in pathology

Traditionally, gastric cancer has been diagnosed by identifying morphological features of malignant cells in histopathological biopsy specimens, but manual pathological inspection of gastric slides is time-consuming and laborious. The need for automatic image analysis and histological classification of gastric cancer has therefore been increasing. Li et al[22] proposed a novel DL-based framework, called GastricNet, to automatically identify gastric cancer. The classification accuracy of the proposed framework was 100% on gastric pathological slices, substantially higher than that of other existing networks, including DenseNet[56] and ResNet[57].

In addition, WSI, a virtual counterpart of glass slides[58], is considered comparable to optical microscopy for the diagnosis of gastric cancer. Advances in WSI have led to several AI applications in pathological diagnosis. Sharma et al[23] reported that a CNN architecture could efficiently analyze pathological images, with an accuracy of 0.6990 for cancer classification and 0.8144 for necrosis detection. Leon et al[24] assessed deep CNNs for the automatic detection of gastric cancer in pathological images. Two deep CNN-based approaches were presented: one analyzed morphological features from whole images, while the other independently investigated local characteristic properties. The experiments showed an average accuracy of up to 89.72%, demonstrating the excellent performance of the proposed model in gastric cancer detection. Iizuka et al[25] trained CNNs and recurrent neural networks to distinguish gastric adenocarcinoma, adenoma, and non-neoplastic tissue. On three independent test sets of biopsy histopathology WSIs, the DL models achieved AUCs of up to 0.97 for the classification of gastric adenocarcinoma. Yoshida et al[26] compared the classifications of experienced pathologists with those of the e-Pathologist developed by NEC Corporation. Although the overall concordance rate between the two methods was only 55.6% (1702/3062), the concordance rate for negative biopsy specimens was as high as 90.6% (1033/1140). Furthermore, current evidence indicates that TILs are associated with the prognosis of gastric cancer[59]. A related study presented a CNN model to automatically detect TILs on histopathological WSIs with an accuracy of 96.88%[27].
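
Because whole slides are far too large for a CNN to ingest at once, these WSI studies generally work patch by patch. The sketch below illustrates that generic workflow, tiling, per-patch scoring, and slide-level aggregation, with a placeholder scoring function standing in for a trained classifier.

```python
# Illustrative only: tile a slide, score each patch, aggregate to a slide call.
import numpy as np

def patch_scores(slide, patch_size=256, score_fn=None):
    """Yield one tumor score per non-overlapping patch of a grayscale slide."""
    h, w = slide.shape[:2]
    for y in range(0, h - patch_size + 1, patch_size):
        for x in range(0, w - patch_size + 1, patch_size):
            patch = slide[y:y + patch_size, x:x + patch_size]
            # Placeholder score; a real pipeline would call a trained CNN here.
            yield score_fn(patch) if score_fn else float(patch.mean() / 255.0)

slide = np.random.randint(0, 256, size=(2048, 2048), dtype=np.uint8)  # stand-in WSI
scores = np.fromiter(patch_scores(slide), dtype=float)

# Simple aggregation rule: flag the slide if any patch score is high.
print("patches:", scores.size, "max score:", round(scores.max(), 3),
      "slide positive:", bool(scores.max() > 0.9))
```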

The automatic segmentation of lesion regions is a challenge in the AI-assisted pathological diagnosis of gastric cancer. To alleviate the shortage of well-annotated pathology image data, Liang et al[28] first applied a DL method to segment pathological images. They presented a new neural network architecture and an algorithm termed overlapped region forecast for the detection of gastric cancer. An intersection over union (IoU) of 88.3% and an accuracy of 91.1% indicated that the model reached the level of fully supervised learning. Qu et al[29] presented a novel type of intermediate dataset and developed a stepwise fine-tuning scheme to improve the classification performance of deep neural networks. Sun et al[30] also demonstrated that their proposed DL model was a powerful image segmentation tool, with a mean accuracy of 91.60% and a mean IoU of 82.65%. Another study showed that the Mask R-CNN model is an effective method for medical image segmentation[31].
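
For reference, the IoU metric quoted above is straightforward to compute for binary masks; a short sketch with toy masks follows.

```python
# Illustrative only: IoU of a predicted mask against a ground-truth mask.
import numpy as np

def iou(pred_mask, true_mask):
    """Intersection over union of two boolean masks."""
    pred_mask, true_mask = pred_mask.astype(bool), true_mask.astype(bool)
    inter = np.logical_and(pred_mask, true_mask).sum()
    union = np.logical_or(pred_mask, true_mask).sum()
    return inter / union if union else 1.0

true = np.zeros((100, 100), dtype=bool)
true[20:80, 20:80] = True                  # ground-truth lesion region
pred = np.zeros_like(true)
pred[30:90, 25:85] = True                  # predicted region, partly shifted
print("IoU:", round(iou(pred, true), 3))
```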

Hence, these promising results highlight the potential of AI-assisted applications to support pathologists in the detection of gastric cancer, particularly by improving the efficiency of image segmentation and reducing diagnostic time (summarized in Table 2).

Table 2 Applications of artificial intelligence in pathology based on different study populations.
Ref. | Year | Country/region | Number of cases | Study population | Methods | Results
Li et al[22] | 2018 | China | 700 slices | Public gastric slice dataset | GastricNet | Accuracy (100%)
Sharma et al[23] | 2017 | Germany | 454 cases | Hospital | CNN | Accuracy (0.6990 for cancer classification; 0.8144 for necrosis detection)
Leon et al[24] | 2019 | Colombia | 40 images | Department of pathology | Deep CNN | Accuracy (up to 89.72%)
Iizuka et al[25] | 2020 | Japan | 1746 biopsy histopathology WSIs | Hospital, TCGA | CNN, RNN | AUC (up to 0.98), accuracy (95.6%)
Yoshida et al[26] | 2018 | Japan | 3062 gastric biopsy specimens | Cancer center | ML | Overall concordance rate (55.6%)
Garcia et al[27] | 2017 | Peru | 3257 images | - | Deep CNN | Accuracy (96.88%)
Liang et al[28] | 2019 | China | 1900 images | - | DL | IoU (0.883), accuracy (91.09%)
Qu et al[29] | 2018 | Japan | 9720 images/19440 images | Hospital | DL | AUC (up to 0.965)
Sun et al[30] | 2019 | China | 500 pathological images | Hospital | DL | IoU (0.8265), accuracy (91.60%)
Cao et al[31] | 2019 | China | 1399 pathological sections | - | Mask R-CNN | AP value (61.2)

AI-assisted diagnosis in CT imaging

Owing to its noninvasiveness and convenience, CT is widely used for the clinical diagnosis of gastric cancer[60,61]. However, diagnostic accuracy depends mainly on the clinical experience of radiologists; when interpreting large numbers of CT images, a radiologist's accuracy inevitably decreases and errors become more likely. Several ML- and DL-based approaches have been reported to effectively extract valuable information from CT images (summarized in Table 3). For example, Huang et al[32] applied DL to diagnostic analysis and created a deep CNN model to identify preoperative peritoneal metastasis in advanced gastric cancer.

Table 3 Applications of artificial intelligence in computed tomography based on different study populations.
Ref. | Year | Country/region | Number of cases | Study population | Methods | Results
Huang et al[32] | 2020 | China | - | Hospital | Deep CNN | -
Gao et al[33] | 2019 | China | 32495 images | Hospital | FR-CNN | AUC (0.9541)
Li et al[34] | 2015 | China | 26 cases | Hospital | KNN algorithm | Accuracy (76.92%)
Li et al[35] | 2012 | China | 38 lymph node datasets | Hospital | ML | Accuracy (96.33%)

Given the poor depiction of lymph node metastasis and low detection sensitivity of conventional CT, novel DL-based models are expected to improve the performance of CT imaging. Gao et al[33] developed and validated a faster region-based CNN (Faster R-CNN) based on CT images. The experimental results showed that the Faster R-CNN achieved high accuracy in diagnosing perigastric metastatic lymph nodes, with a mean average precision of 0.7801 and an AUC of 0.9541.
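
A hedged sketch of this model family (using torchvision's generic Faster R-CNN, not the authors' implementation) is shown below: the detector head is replaced for a two-class task such as metastatic lymph node versus background, and one dummy training step is run. The weights argument assumes torchvision 0.13 or later.

```python
# Illustrative only: adapt torchvision's Faster R-CNN head to 2 classes and
# run one dummy training step on a fake CT slice with one annotated box.
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

num_classes = 2                                   # background + lymph node
model = fasterrcnn_resnet50_fpn(weights="DEFAULT")
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)

model.train()
images = [torch.rand(3, 512, 512)]                            # dummy CT slice
targets = [{"boxes": torch.tensor([[100.0, 120.0, 180.0, 200.0]]),
            "labels": torch.tensor([1])}]                     # one "node" box
losses = model(images, targets)                               # dict of losses
sum(losses.values()).backward()
print({name: round(value.item(), 3) for name, value in losses.items()})
```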

Moreover, dual-energy spectral CT (DEsCT), a new imaging technique, can easily switch between high-energy and low-energy datasets, enabling the precise creation of virtual images based on monochromatic spectra. Recent improvements have made DEsCT available for routine clinical practice, but it is difficult for radiologists to take full advantage of the larger amount of quantitative data the DEsCT system provides. A recent study introduced AI-assisted DEsCT imaging for staging and characterizing gastric cancer[34]. The authors used a new multiple instance learning method to determine the invasion depth of gastric cancer and achieved an overall accuracy of 0.7692 after optimization. It has also been reported that gemstone spectral imaging (GSI) can provide radiologists with more valuable image information than conventional CT[62]. Li et al[35] proposed ML-based GSI analysis for lymph node metastasis in gastric cancer and achieved higher detection accuracy. In their study, GSI-CT outperformed traditional methods for diagnosing lymph node metastasis, such as endoscopic ultrasound and multidetector-row CT, in both feasibility and effectiveness.
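
The kNN route can be sketched as follows (a minimal scikit-learn example with simulated quantitative imaging features, not the GSI data from the study): features are standardized and a 5-nearest-neighbor classifier is scored by cross-validation.

```python
# Illustrative only: kNN on a table of simulated quantitative node features.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
features = rng.normal(size=(120, 6))      # e.g., spectral measurements per node
labels = rng.integers(0, 2, size=120)     # 0 = benign, 1 = metastatic

knn = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
scores = cross_val_score(knn, features, labels, cv=5)
print("5-fold CV accuracy:", round(scores.mean(), 3))
```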

AI IN PROGNOSIS PREDICTION OF GASTRIC CANCER

Accurate prognosis prediction in gastric cancer is of significance for both clinicians and patients, as such information can assist clinicians in decision making and improve patient management. It is increasingly appreciated that demographics, pathological indicators, physiological status, and even social contacts affect the prognosis of gastric cancer patients. However, conventional statistical methods, such as the tumor–node–metastasis (TNM) staging system and nomograms, can hardly capture the complicated interconnections among these characteristics. With their excellent computational power and integration capability, AI models have therefore been applied to improve outcome prediction for gastric cancer patients.

In the last few years, applications of AI in prognosis have involved the prediction of survival[36-38], recurrence risk[39,40], and metastasis[41-43] (summarized in Table 4). Jiang et al[36] applied SVM to survival analysis and developed a prognostic classifier that predicted overall survival and disease-free survival more accurately than the TNM staging system defined by the American Joint Committee on Cancer. In addition, the proposed gastric cancer SVM classifier was also used to predict the benefit of adjuvant chemotherapy, facilitating individualized treatment for gastric cancer. Combining the demographics, pathological indicators, and physiological characteristics of 939 cases, Lu et al[37] created a novel multimodal hypergraph learning framework to improve the accuracy of survival prediction; the proposed approach outperformed random forest and SVM in overall survival prediction. Another study compared artificial neural networks (ANN) and Bayesian neural networks (BNN) for survival prediction in gastric cancer patients and found the BNN superior to the ANN[38].
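
The comparison reported in these studies, an ML classifier built on multiple clinicopathological features versus a stage-only baseline, can be sketched as below with synthetic data; the feature names and effect sizes are invented for illustration.

```python
# Illustrative only: SVM on synthetic clinicopathological features vs a
# stage-only baseline, both scored by AUC for a 5-year outcome label.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 600
stage = rng.integers(1, 5, size=n)                        # stage I-IV (coded 1-4)
markers = rng.normal(size=(n, 5))                         # e.g., immunomarkers
risk = 0.8 * stage + markers[:, 0] + 0.5 * markers[:, 1]  # invented risk model
event = (risk + rng.normal(size=n) > 3.5).astype(int)     # death within 5 years

X = np.column_stack([stage, markers])
X_tr, X_te, y_tr, y_te, st_tr, st_te = train_test_split(X, event, stage,
                                                        random_state=0)
svm = make_pipeline(StandardScaler(), SVC(probability=True, random_state=0))
svm.fit(X_tr, y_tr)
print("stage-only AUC:", round(roc_auc_score(y_te, st_te), 3))
print("SVM AUC:", round(roc_auc_score(y_te, svm.predict_proba(X_te)[:, 1]), 3))
```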

Table 4 Applications of artificial intelligence in gastric cancer prognosis based on different study populations.
Ref. | Year | Country/region | Number of cases | Study population | Methods | Results
Jiang et al[36] | 2018 | China | 786 cases | Hospital | SVM classifier | AUC (up to 0.834)
Lu et al[37] | 2017 | China | 939 patients | Hospital | MMHG | Accuracy (69.28%)
Korhani Kangi et al[38] | 2018 | Iran | 339 patients | Hospital | ANN, BNN | Sensitivity (88.2% for ANN, 90.3% for BNN), specificity (95.4% for ANN, 90.9% for BNN)
Zhang et al[39] | 2019 | China | 669 cases | Hospital | ML | AUC (up to 0.831)
Liu et al[40] | 2018 | China | 432 GC tissue samples | Hospital | SVM classifier | Accuracy (up to 94.19%)
Bollschweiler et al[41] | 2004 | Germany, Japan | 135 cases | Cancer center | ANN | Accuracy (93%)
Hensler et al[42] | 2005 | Germany, Japan | 4302 cases | Cancer center | QUEEN technique | Accuracy (72.73%)
Jagric et al[43] | 2010 | Slovenia | 213 cases | Clinical center | Learning vector quantization neural networks | Sensitivity (71%), specificity (96.1%)

Recurrence is one of the leading causes of death in gastric cancer patients[63]; therefore, accurate evaluation of recurrence risk is relevant in routine clinical work. Recent reports indicate that AI-assisted recurrence prediction systems achieve better performance than traditional statistical methods. Zhang et al[39] used ML methods to extract radiomic signatures from the CT images of 669 consecutive patients diagnosed with advanced gastric cancer and then constructed a CT-based radiomic model to predict the recurrence risk of advanced gastric cancer. Liu et al[40] trained an SVM classifier to predict recurrence in patients with gastric cancer. Using the gene expression profiling dataset GSE26253[64], they identified a set of feature genes (including PLCG1, PRKACA, and TGFBR1) potentially correlated with gastric cancer recurrence.
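
A hedged sketch of the gene-expression route (simulated expression values, not GSE26253) is given below: a small panel of feature genes is selected inside a cross-validated pipeline and an SVM is trained on their expression.

```python
# Illustrative only: select feature genes inside a cross-validated pipeline
# and train a linear SVM on simulated expression data (not GSE26253).
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
expression = rng.normal(size=(300, 2000))     # 300 samples x 2000 genes
recurred = rng.integers(0, 2, size=300)       # 1 = recurrence during follow-up

model = make_pipeline(SelectKBest(f_classif, k=20),   # keep a 20-gene panel
                      SVC(kernel="linear"))
scores = cross_val_score(model, expression, recurred, cv=5)
print("5-fold CV accuracy:", round(scores.mean(), 3))
```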

Lymph node metastasis is a significant prognostic indicator in gastric cancer[65]. The lack of accurate methods to predict gastric cancer metastasis has led to the application of AI-assisted prediction techniques to better evaluate metastasis risk. Bollschweiler et al[41] demonstrated that artificial neural networks could broadly enhance the predictive accuracy for lymph node metastasis. Hensler et al[42] proposed a novel artificial neural network approach for the preoperative prediction of lymph node metastasis; compared with the Maruyama Diagnostic System developed at the National Cancer Center in Tokyo, the proposed model showed higher accuracy and better reliability. Liver metastases have also been shown to severely diminish the long-term survival of gastric cancer patients. Jagric et al[43] presented learning vector quantization neural networks to predict postoperative liver metastasis in patients with gastric cancer and obtained reasonably high predictive value.

CHALLENGES AND FUTURE PERSPECTIVES

Despite the reported success of AI in medical image-based diagnosis and prognosis prediction, several barriers must be removed before widespread clinical adoption can occur.

A flexible AI model requires a large amount of well-annotated data for training, validation, and testing, whereas research with small sample sizes is prone to measurement error[66]. With advances in medical imaging, such as endoscopy and pathology, large volumes of data are generated continuously to help physicians in clinical diagnosis and decision making. However, such data are rarely labeled or annotated and are therefore not suitable for algorithm training. Hence, the availability of high-quality data is a significant challenge for the development and optimization of AI. A meaningful way to access such qualified datasets is to establish large-scale open-access databases. Moreover, existing data resources should be used effectively: individual hospitals and institutions are encouraged to share validated data to improve the applicability of AI in gastric cancer, similar to previous collaborative research on Alzheimer's disease[67].

An additional hurdle to developing robust algorithms for gastric cancer is the interpretability of AI. In some studies, ML and DL applications showed higher sensitivity and fewer false positives than radiologists[68,69]. However, they also inevitably run the risk of overfitting, leading to a tradeoff between accuracy and interpretability. In addition, the "black box" nature of these algorithms may arouse clinicians' suspicion of ML applications; Cabitza et al[70] argued that the "black box" of ML may bring unintended negative consequences in clinical practice. Fortunately, recent advances in visualization tools have deepened the visual understanding of algorithmic decision making[71], contributing to the optimization of algorithms and their broader clinical acceptance.
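
Grad-CAM[71] is one such visualization tool. The sketch below, assuming PyTorch/torchvision (0.13 or later), shows the core idea: gradients of a class score with respect to the last convolutional feature map are averaged into channel weights, and the weighted map highlights the image regions driving the prediction. The network and input are generic placeholders.

```python
# Illustrative only: Grad-CAM heatmap from the last conv block of a generic
# pretrained ResNet; input image is a random placeholder.
import torch
import torch.nn.functional as F
from torchvision import models

model = models.resnet18(weights="DEFAULT").eval()
activations, gradients = {}, {}

def save_activation(module, inputs, output):
    activations["value"] = output                      # feature map (1, 512, 7, 7)
    output.register_hook(lambda grad: gradients.update(value=grad))

model.layer4.register_forward_hook(save_activation)   # hook the last conv block

image = torch.randn(1, 3, 224, 224)                    # placeholder input image
logits = model(image)
logits[0, logits.argmax()].backward()                  # backprop the top class score

weights = gradients["value"].mean(dim=(2, 3), keepdim=True)      # channel weights
cam = F.relu((weights * activations["value"]).sum(dim=1, keepdim=True))
cam = F.interpolate(cam, size=image.shape[2:], mode="bilinear",
                    align_corners=False)
cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)          # scale to [0, 1]
print("Grad-CAM heatmap shape:", tuple(cam.shape))
```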

Given its computational power and learning capacity, AI will appear in more and more areas of gastric cancer care. It is increasingly appreciated that disease characteristics, the physiological and psychological states of patients, and even social communication affect the prognosis of gastric cancer patients, and it is difficult for physicians to integrate such complex data manually. AI models are adept at integrating information from vast amounts of data and thus have the potential to substantially reduce the workload of clinicians. However, owing to ethical and safety issues, predictions generated by AI require further evaluation and interpretation by professional physicians. AI techniques will therefore not wholly replace physicians in future clinical practice; rather, combining human expertise with AI can achieve the ideal state of higher efficiency.

CONCLUSION

AI techniques, especially ML and DL, are making remarkable progress in the field of gastric cancer. This review has comprehensively introduced the current status and future perspectives of AI-assisted diagnosis and prognosis. Numerous studies have reported impressive AI performance superior to that of standard statistical methods. Despite existing limitations and hurdles, such as the shortage of well-annotated data and limited model interpretability, AI, with its efficient computational power and learning capacity, will revolutionize the diagnosis and prognosis of gastric cancer in the foreseeable future.

Footnotes

Manuscript source: Invited manuscript

Specialty type: Gastroenterology and hepatology

Country/Territory of origin: China

Peer-review report’s scientific quality classification

Grade A (Excellent): 0

Grade B (Very good): B, B, B

Grade C (Good): 0

Grade D (Fair): 0

Grade E (Poor): 0

P-Reviewer: Ishizawa K, Kinami S, Srivastava M S-Editor: Gao CC L-Editor: Filipodia P-Editor: Wang LL

References
1. Jemal A, Ward EM, Johnson CJ, Cronin KA, Ma J, Ryerson B, Mariotto A, Lake AJ, Wilson R, Sherman RL, Anderson RN, Henley SJ, Kohler BA, Penberthy L, Feuer EJ, Weir HK. Annual Report to the Nation on the Status of Cancer, 1975-2014, Featuring Survival. J Natl Cancer Inst. 2017;109:djx030.
2. Islami F, DeSantis CE, Jemal A. Incidence Trends of Esophageal and Gastric Cancer Subtypes by Race, Ethnicity, and Age in the United States, 1997-2014. Clin Gastroenterol Hepatol. 2019;17:429-439.
3. Steevens J, Botterweck AA, Dirx MJ, van den Brandt PA, Schouten LJ. Trends in incidence of oesophageal and stomach cancer subtypes in Europe. Eur J Gastroenterol Hepatol. 2010;22:669-678.
4. Ferlay J, Colombet M, Soerjomataram I, Mathers C, Parkin DM, Piñeros M, Znaor A, Bray F. Estimating the global cancer incidence and mortality in 2018: GLOBOCAN sources and methods. Int J Cancer. 2019;144:1941-1953.
5. Russel S, Norvig P. Artificial Intelligence: A Modern Approach. 2nd ed. Pearson Education, 2003.
6. Christian R. Machine Learning, a Probabilistic Perspective. Chance. 2014;27:62-63.
7. Liu DY, Gan T, Rao NN, Xing YW, Zheng J, Li S, Luo CS, Zhou ZJ, Wan YL. Identification of lesion images from gastrointestinal endoscope based on feature extraction of combinational methods with and without learning process. Med Image Anal. 2016;32:281-294.
8. Ali H, Yasmin M, Sharif M, Rehmani MH. Computer assisted gastric abnormalities detection using hybrid texture descriptors for chromoendoscopy images. Comput Methods Programs Biomed. 2018;157:39-47.
9. Luo H, Xu G, Li C, He L, Luo L, Wang Z, Jing B, Deng Y, Jin Y, Li Y, Li B, Tan W, He C, Seeruttun SR, Wu Q, Huang J, Huang DW, Chen B, Lin SB, Chen QM, Yuan CM, Chen HX, Pu HY, Zhou F, He Y, Xu RH. Real-time artificial intelligence for detection of upper gastrointestinal cancer by endoscopy: a multicentre, case-control, diagnostic study. Lancet Oncol. 2019;20:1645-1654.
10. Sakai Y, Takemoto S, Hori K, Nishimura M, Ikematsu H, Yano T, Yokota H. Automatic detection of early gastric cancer in endoscopic images using a transferring convolutional neural network. Conf Proc IEEE Eng Med Biol Soc. 2018;2018:4138-4141.
11. Yoon HJ, Kim S, Kim JH, Keum JS, Oh SI, Jo J, Chun J, Youn YH, Park H, Kwon IG, Choi SH, Noh SH. A Lesion-Based Convolutional Neural Network Improves Endoscopic Detection and Depth Prediction of Early Gastric Cancer. J Clin Med. 2019;8:1310.
12. Nakahira H, Ishihara R, Aoyama K, Kono M, Fukuda H, Shimamoto Y, Nakagawa K, Ohmori M, Iwatsubo T, Iwagami H, Matsuno K, Inoue S, Matsuura N, Shichijo S, Maekawa A, Kanesaka T, Yamamoto S, Takeuchi Y, Higashino K, Uedo N, Matsunaga T, Tada T. Stratification of gastric cancer risk using a deep neural network. JGH Open. 2020;4:466-471.
13. Zhu Y, Wang QC, Xu MD, Zhang Z, Cheng J, Zhong YS, Zhang YQ, Chen WF, Yao LQ, Zhou PH, Li QL. Application of convolutional neural network in the diagnosis of the invasion depth of gastric cancer based on conventional endoscopy. Gastrointest Endosc. 2019;89:806-815.e1.
14. Wang H, Ding S, Wu DS, Zhang YT, Yang SL. Smart connected electronic gastroscope system for gastric cancer screening using multi-column convolutional neural networks. Int J Prod Res. 2019;57:6795-6806.
15. Guimarães P, Keller A, Fehlmann T, Lammert F, Casper M. Deep-learning based detection of gastric precancerous conditions. Gut. 2020;69:4-6.
16. Miyaki R, Yoshida S, Tanaka S, Kominami Y, Sanomura Y, Matsuo T, Oka S, Raytchev B, Tamaki T, Koide T, Kaneda K, Yoshihara M, Chayama K. A computer system to be used with laser-based endoscopy for quantitative diagnosis of early gastric cancer. J Clin Gastroenterol. 2015;49:108-115.
17. Liu XQ, Wang CL, Hu Y, Zeng Z, Bai JY, Liao GB. Transfer learning with convolutional neural network for early gastric cancer classification on magnifiying narrow-band imaging images. ICIP 2018: Proceedings of the 25th IEEE International Conference on Image Processing; 2018 Oct 07-10; Athens, Greece. New York: IEEE, 2018: 1388-1392.
18. Horiuchi Y, Aoyama K, Tokai Y, Hirasawa T, Yoshimizu S, Ishiyama A, Yoshio T, Tsuchida T, Fujisaki J, Tada T. Convolutional Neural Network for Differentiating Gastric Cancer from Gastritis Using Magnified Endoscopy with Narrow Band Imaging. Dig Dis Sci. 2020;65:1355-1363.
19. Li L, Chen Y, Shen Z, Zhang X, Sang J, Ding Y, Yang X, Li J, Chen M, Jin C, Chen C, Yu C. Convolutional neural network for the diagnosis of early gastric cancer based on magnifying narrow band imaging. Gastric Cancer. 2020;23:126-132.
20. Bergholt MS, Zheng W, Lin K, Ho KY, Teh M, Yeoh KG, Yan So JB, Huang Z. In vivo diagnosis of gastric cancer using Raman endoscopy and ant colony optimization techniques. Int J Cancer. 2011;128:2673-2680.
21. Duraipandian S, Sylvest Bergholt M, Zheng W, Yu Ho K, Teh M, Guan Yeoh K, Bok Yan So J, Shabbir A, Huang Z. Real-time Raman spectroscopy for in vivo, online gastric cancer diagnosis during clinical endoscopic examination. J Biomed Opt. 2012;17:081418.
22. Li YX, Li XC, Xie XP, Shen LL. Deep learning based gastric cancer identification. ISBI 2018: Proceedings of the 15th IEEE International Symposium on Biomedical Imaging; 2018 Apr 04-07; Washington, DC. New York: IEEE, 2018: 182-185.
23. Sharma H, Zerbe N, Klempert I, Hellwich O, Hufnagl P. Deep convolutional neural networks for automatic classification of gastric carcinoma using whole slide images in digital histopathology. Comput Med Imaging Graph. 2017;61:2-13.
24. Leon F, Gelvez M, Jaimes Z, Gelvez T, Arguello H. Supervised Classification of Histopathological Images Using Convolutional Neuronal Networks for Gastric Cancer Detection. STSIVA 2019: Proceedings of the 22nd Symposium on Image, Signal Processing and Artificial Vision; 2019 Apr 24-26; Bucaramanga, Colombia. New York: IEEE, 2019.
25. Iizuka O, Kanavati F, Kato K, Rambeau M, Arihiro K, Tsuneki M. Deep Learning Models for Histopathological Classification of Gastric and Colonic Epithelial Tumours. Sci Rep. 2020;10:1504.
26. Yoshida H, Shimazu T, Kiyuna T, Marugame A, Yamashita Y, Cosatto E, Taniguchi H, Sekine S, Ochiai A. Automated histological classification of whole-slide images of gastric biopsy specimens. Gastric Cancer. 2018;21:249-257.
27. Garcia E, Hermoza R, Castanon CB, Cano L, Castillo M, Castaneda C. Automatic Lymphocyte Detection on Gastric Cancer IHC Images using Deep Learning. CBMS 2017: Proceedings of the 30th IEEE International Symposium on Computer-Based Medical Systems; 2017 Jun 22-24; Thessaloniki, Greece. New York: IEEE, 2017: 200-204.
28. Liang Q, Nan Y, Coppola G, Zou K, Sun W, Zhang D, Wang Y, Yu G. Weakly Supervised Biomedical Image Segmentation by Reiterative Learning. IEEE J Biomed Health Inform. 2019;23:1205-1214.
29. Qu J, Hiruta N, Terai K, Nosato H, Murakawa M, Sakanashi H. Gastric Pathology Image Classification Using Stepwise Fine-Tuning for Deep Neural Networks. J Healthc Eng. 2018;2018:8961781.
30. Sun MY, Zhang GH, Dang H, Qi XQ, Zhou XG, Chang Q. Accurate Gastric Cancer Segmentation in Digital Pathology Images Using Deformable Convolution and Multi-Scale Embedding Networks. IEEE Access. 2019;7:75530-75541.
31. Cao GT, Song WL, Zhao ZW. Gastric Cancer Diagnosis with Mask R-CNN. IHMSC 2019: Proceedings of the 11th International Conference on Intelligent Human-Machine Systems and Cybernetics; 2019 Aug 24-25; Hangzhou, China. New York: IEEE, 2019: 60-63.
32. Huang Z, Liu D, Chen X, Yu P, Wu J, Song B, Hu J, Wu B. Retrospective imaging studies of gastric cancer: Study protocol clinical trial (SPIRIT Compliant). Medicine (Baltimore). 2020;99:e19157.
33. Gao Y, Zhang ZD, Li S, Guo YT, Wu QY, Liu SH, Yang SJ, Ding L, Zhao BC, Li S, Lu Y. Deep neural network-assisted computed tomography diagnosis of metastatic lymph nodes from gastric cancer. Chin Med J (Engl). 2019;132:2804-2811.
34. Li C, Shi C, Zhang H, Chen Y, Zhang S. Multiple instance learning for computer aided detection and diagnosis of gastric cancer with dual-energy CT imaging. J Biomed Inform. 2015;57:358-368.
35. Li C, Zhang S, Zhang H, Pang L, Lam K, Hui C, Zhang S. Using the K-nearest neighbor algorithm for the classification of lymph node metastasis in gastric cancer. Comput Math Methods Med. 2012;2012:876545.
36. Jiang Y, Xie J, Han Z, Liu W, Xi S, Huang L, Huang W, Lin T, Zhao L, Hu Y, Yu J, Zhang Q, Li T, Cai S, Li G. Immunomarker Support Vector Machine Classifier for Prediction of Gastric Cancer Survival and Adjuvant Chemotherapeutic Benefit. Clin Cancer Res. 2018;24:5574-5584.
37. Lu F, Chen ZK, Yuan X, Li Q, Du ZD, Luo L, Zhang FY. MMHG: Multi-modal Hypergraph Learning for Overall Survival After D2 Gastrectomy for Gastric Cancer. DASC/PiCom/DataCom/CyberSciTech 2017: Proceedings of the 15th Intl Conf on Dependable, Autonomic and Secure Computing, 15th Intl Conf on Pervasive Intelligence and Computing, 3rd Intl Conf on Big Data Intelligence and Computing and Cyber Science and Technology Congress; 2017 Nov 6-10; Orlando, FL, USA. California: IEEE Computer Society, 2017: 164-9.
38. Korhani Kangi A, Bahrampour A. Predicting the Survival of Gastric Cancer Patients Using Artificial and Bayesian Neural Networks. Asian Pac J Cancer Prev. 2018;19:487-490.
39. Zhang W, Fang M, Dong D, Wang X, Ke X, Zhang L, Hu C, Guo L, Guan X, Zhou J, Shan X, Tian J. Development and validation of a CT-based radiomic nomogram for preoperative prediction of early recurrence in advanced gastric cancer. Radiother Oncol. 2020;145:13-20.
40. Liu B, Tan J, Wang X, Liu X. Identification of recurrent risk-related genes and establishment of support vector machine prediction model for gastric cancer. Neoplasma. 2018;65:360-366.
41. Bollschweiler EH, Mönig SP, Hensler K, Baldus SE, Maruyama K, Hölscher AH. Artificial neural network for prediction of lymph node metastases in gastric cancer: a phase II diagnostic study. Ann Surg Oncol. 2004;11:506-511.
42. Hensler K, Waschulzik T, Mönig SP, Maruyama K, Hölscher AH, Bollschweiler E. Quality-assured Efficient Engineering of Feedforward Neural Networks (QUEEN) -- pretherapeutic estimation of lymph node status in patients with gastric carcinoma. Methods Inf Med. 2005;44:647-654.
43. Jagric T, Potrc S, Jagric T. Prediction of liver metastases after gastric cancer resection with the use of learning vector quantization neural networks. Dig Dis Sci. 2010;55:3252-3261.
44. Amin MB, Greene FL, Edge SB, Compton CC, Gershenwald JE, Brookland RK, Meyer L, Gress DM, Byrd DR, Winchester DP. The Eighth Edition AJCC Cancer Staging Manual: Continuing to build a bridge from a population-based to a more “personalized” approach to cancer staging. CA Cancer J Clin. 2017;67:93-99.
45. Sano T, Coit DG, Kim HH, Roviello F, Kassab P, Wittekind C, Yamamoto Y, Ohashi Y. Proposal of a new stage grouping of gastric cancer for TNM classification: International Gastric Cancer Association staging project. Gastric Cancer. 2017;20:217-225.
46. Choi J, Kim SG, Im JP, Kim JS, Jung HC, Song IS. Comparison of endoscopic ultrasonography and conventional endoscopy for prediction of depth of tumor invasion in early gastric cancer. Endoscopy. 2010;42:705-713.
47. Krizhevsky A, Sutskever I, Hinton GE. ImageNet classification with deep convolutional neural networks. Commun ACM. 2017;60:84-90.
48. Szegedy C, Vanhoucke V, Ioffe S, Shlens J, Wojna Z. Rethinking the Inception Architecture for Computer Vision. CVPR 2016: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition; 2016 Jun 27-30; Seattle, WA, USA. New York: IEEE, 2016: 2818-2826.
49. Simonyan K, Zisserman A. Very Deep Convolutional Networks for Large-Scale Image Recognition; 2014. Preprint. [cited 10 April 2015]. Available from: 1409.3301.
50. Ding JR, Cheng HD, Xian M, Zhang YT, Xu F. Local-weighted Citation-kNN algorithm for breast ultrasound image classification. Optik. 2015;126:5188-5193.
51. Shen D, Wu G, Suk HI. Deep Learning in Medical Image Analysis. Annu Rev Biomed Eng. 2017;19:221-248.
52. Correa P, Piazuelo MB. The gastric precancerous cascade. J Dig Dis. 2012;13:2-9.
53. Pimentel-Nunes P, Dinis-Ribeiro M, Soares JB, Marcos-Pinto R, Santos C, Rolanda C, Bastos RP, Areia M, Afonso L, Bergman J, Sharma P, Gotoda T, Henrique R, Moreira-Dias L. A multicenter validation of an endoscopic classification with narrow band imaging for gastric precancerous and cancerous lesions. Endoscopy. 2012;44:236-246.
54. Nakanishi H, Doyama H, Ishikawa H, Uedo N, Gotoda T, Kato M, Nagao S, Nagami Y, Aoyagi H, Imagawa A, Kodaira J, Mitsui S, Kobayashi N, Muto M, Takatori H, Abe T, Tsujii M, Watari J, Ishiyama S, Oda I, Ono H, Kaneko K, Yokoi C, Ueo T, Uchita K, Matsumoto K, Kanesaka T, Morita Y, Katsuki S, Nishikawa J, Inamura K, Kinjo T, Yamamoto K, Yoshimura D, Araki H, Kashida H, Hosokawa A, Mori H, Yamashita H, Motohashi O, Kobayashi K, Hirayama M, Kobayashi H, Endo M, Yamano H, Murakami K, Koike T, Hirasawa K, Miyaoka Y, Hamamoto H, Hikichi T, Hanabata N, Shimoda R, Hori S, Sato T, Kodashima S, Okada H, Mannami T, Yamamoto S, Niwa Y, Yashima K, Tanabe S, Satoh H, Sasaki F, Yamazato T, Ikeda Y, Nishisaki H, Nakagawa M, Matsuda A, Tamura F, Nishiyama H, Arita K, Kawasaki K, Hoppo K, Oka M, Ishihara S, Mukasa M, Minamino H, Yao K. Evaluation of an e-learning system for diagnosis of gastric lesions using magnifying narrow-band imaging: a multicenter randomized controlled study. Endoscopy. 2017;49:957-967.
55. Mirabal YN, Chang SK, Atkinson EN, Malpica A, Follen M, Richards-Kortum R. Reflectance spectroscopy for in vivo detection of cervical precancer. J Biomed Opt. 2002;7:587-594.
56. Huang G, Liu Z, van der Maaten L, Weinberger KQ. Densely Connected Convolutional Networks. CVPR 2017: Proceedings of the 30th IEEE/CVF Conference on Computer Vision and Pattern Recognition; 2017 Jul 21-26; Honolulu, HI, USA. New York: IEEE, 2017: 2261-2269.
57. He K, Zhang X, Ren S, Sun J. Deep Residual Learning for Image Recognition. CVPR 2016: IEEE Conference on Computer Vision and Pattern Recognition; 2016 Jun 27-30; Seattle, WA, United States. New York: IEEE, 2016: 770-778.
58. Mukhopadhyay S, Feldman MD, Abels E, Ashfaq R, Beltaifa S, Cacciabeve NG, Cathro HP, Cheng L, Cooper K, Dickey GE, Gill RM, Heaton RP, Kerstens R, Lindberg GM, Malhotra RK, Mandell JW, Manlucu ED, Mills AM, Mills SE, Moskaluk CA, Nelis M, Patil DT, Przybycin CG, Reynolds JP, Rubin BP, Saboorian MH, Salicru M, Samols MA, Sturgis CD, Turner KO, Wick MR, Yoon JY, Zhao P, Taylor CR. Whole Slide Imaging Versus Microscopy for Primary Diagnosis in Surgical Pathology: A Multicenter Blinded Randomized Noninferiority Study of 1992 Cases (Pivotal Study). Am J Surg Pathol. 2018;42:39-52.
59. Amedei A, Della Bella C, Silvestri E, Prisco D, D'Elios MM. T cells in gastric cancer: friends or foes. Clin Dev Immunol. 2012;2012:690571.
60. Wang FH, Shen L, Li J, Zhou ZW, Liang H, Zhang XT, Tang L, Xin Y, Jin J, Zhang YJ, Yuan XL, Liu TS, Li GX, Wu Q, Xu HM, Ji JF, Li YF, Wang X, Yu S, Liu H, Guan WL, Xu RH. The Chinese Society of Clinical Oncology (CSCO): clinical guidelines for the diagnosis and treatment of gastric cancer. Cancer Commun (Lond). 2019;39:10.
61. Muro K, Van Cutsem E, Narita Y, Pentheroudakis G, Baba E, Li J, Ryu MH, Zamaniah WIW, Yong WP, Yeh KH, Kato K, Lu Z, Cho BC, Nor IM, Ng M, Chen LT, Nakajima TE, Shitara K, Kawakami H, Tsushima T, Yoshino T, Lordick F, Martinelli E, Smyth EC, Arnold D, Minami H, Tabernero J, Douillard JY. Pan-Asian adapted ESMO Clinical Practice Guidelines for the management of patients with metastatic gastric cancer: a JSMO-ESMO initiative endorsed by CSCO, KSMO, MOS, SSO and TOS. Ann Oncol. 2019;30:19-33.
62. Chandra N, Langan DA. Gemstone Detector: Dual Energy Imaging via Fast kVp Switching. In: Johnson T, Fink C, Schönberg S, Reiser M (eds) Dual Energy CT in Clinical Practice. Medical Radiology. Springer, Berlin, Heidelberg 2011; 35-41.
63. Li F, Zhang R, Liang H, Liu H, Quan J. The pattern and risk factors of recurrence of proximal gastric cancer after curative resection. J Surg Oncol. 2013;107:130-135.
64. Lee J, Sohn I, Do IG, Kim KM, Park SH, Park JO, Park YS, Lim HY, Sohn TS, Bae JM, Choi MG, Lim DH, Min BH, Lee JH, Rhee PL, Kim JJ, Choi DI, Tan IB, Das K, Tan P, Jung SH, Kang WK, Kim S. Nanostring-based multigene assay to predict recurrence for gastric cancer patients after surgery. PLoS One. 2014;9:e90133.
65. Siewert JR, Böttcher K, Stein HJ, Roder JD. Relevant prognostic factors in gastric cancer: ten-year results of the German Gastric Cancer Study. Ann Surg. 1998;228:449-461.
66. Loken E, Gelman A. Measurement error and the replication crisis. Science. 2017;355:584-585.
67. Jack CR, Bernstein MA, Fox NC, Thompson P, Alexander G, Harvey D, Borowski B, Britson PJ, L Whitwell J, Ward C, Dale AM, Felmlee JP, Gunter JL, Hill DL, Killiany R, Schuff N, Fox-Bosetti S, Lin C, Studholme C, DeCarli CS, Krueger G, Ward HA, Metzger GJ, Scott KT, Mallozzi R, Blezek D, Levy J, Debbins JP, Fleisher AS, Albert M, Green R, Bartzokis G, Glover G, Mugler J, Weiner MW. The Alzheimer's Disease Neuroimaging Initiative (ADNI): MRI methods. J Magn Reson Imaging. 2008;27:685-691.
68. Chartrand G, Cheng PM, Vorontsov E, Drozdzal M, Turcotte S, Pal CJ, Kadoury S, Tang A. Deep Learning: A Primer for Radiologists. Radiographics. 2017;37:2113-2131.
69. Lan K, Wang DT, Fong S, Liu LS, Wong KKL, Dey N. A Survey of Data Mining and Deep Learning in Bioinformatics. J Med Syst. 2018;42:139.
70. Cabitza F, Rasoini R, Gensini GF. Unintended Consequences of Machine Learning in Medicine. JAMA. 2017;318:517-518.
71. Selvaraju RR, Cogswell M, Das A, Vedantam R, Parikh D, Batra D. Grad-CAM: Visual Explanations from Deep Networks via Gradient-Based Localization. ICCV 2017: Proceedings of the 16th IEEE International Conference on Computer Vision; 2017 Oct 22-29; Venice, Italy. New York: IEEE, 2017: 618-626.