Minireviews Open Access
Copyright ©The Author(s) 2021. Published by Baishideng Publishing Group Inc. All rights reserved.
Artif Intell Cancer. Apr 28, 2021; 2(2): 12-24
Published online Apr 28, 2021. doi: 10.35713/aic.v2.i2.12
Advances in the application of artificial intelligence in solid tumor imaging
Ying Shao, Department of Laboratory Medicine, People's Hospital of Jiangyin, Jiangyin 214400, Jiangsu Province, China
Yu-Xuan Zhang, Huan-Huan Chen, Shi-Chang Zhang, Jie-Xin Zhang, Department of Laboratory Medicine, The First Affiliated Hospital of Nanjing Medical University, Nanjing 210029, Jiangsu Province, China
Shan-Shan Lu, Department of Radiology, The First Affiliated Hospital of Nanjing Medical University, Nanjing 210029, Jiangsu Province, China
ORCID number: Ying Shao (0000-0002-5732-0246); Yu-Xuan Zhang (0000-0003-3832-0328); Huan-Huan Chen (0000-0002-7958-8657); Shan-Shan Lu (0000-0002-5438-8050); Shi-Chang Zhang (0000-0002-6587-2518); Jie-Xin Zhang (0000-0003-1407-7562).
Author contributions: Shao Y and Zhang YX performed the majority of the writing and they contributed equally to this minireview; Chen HH and Lu SS provided input in writing the paper; Zhang SC and Zhang JX designed the outline and coordinated the writing of the paper.
Supported by the “Six Top Talent Project” of Jiangsu Province, No. WSW-004; and the National Natural Science Foundation of China, No. 81671836.
Conflict-of-interest statement: The authors declare that they have no conflict of interest.
Open-Access: This article is an open-access article that was selected by an in-house editor and fully peer-reviewed by external reviewers. It is distributed in accordance with the Creative Commons Attribution NonCommercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited and the use is non-commercial. See: http://creativecommons.org/Licenses/by-nc/4.0/
Corresponding author: Jie-Xin Zhang, MD, PhD, Associate Professor, Senior Researcher, Department of Laboratory Medicine, The First Affiliated Hospital of Nanjing Medical University, No. 300 Guangzhou Road, Nanjing 210029, Jiangsu Province, China. jiexinzhang@njmu.edu.cn
Received: March 9, 2021
Peer-review started: March 9, 2021
First decision: March 26, 2021
Revised: April 2, 2021
Accepted: April 20, 2021
Article in press: April 20, 2021
Published online: April 28, 2021

Abstract

Early diagnosis and timely treatment are crucial in reducing cancer-related mortality. Artificial intelligence (AI) has greatly relieved clinical workloads and changed current medical workflows. We searched recent studies, reports and reviews on AI and solid tumors; many reviews have summarized AI applications in the diagnosis and treatment of a single tumor type. Here we systematically review the advances of AI applications across multiple solid tumors, including those of the esophagus, stomach, intestine, breast, thyroid, prostate, lung, liver, cervix, pancreas and kidney, with a specific focus on the continual improvement of model performance in imaging practice.

Key Words: Artificial intelligence, Oncology, Imaging, Model performance

Core Tip: Many reviews have summarized artificial intelligence applications in the diagnosis and treatment of a single tumor type. This, however, is the first review to systematically examine how artificial intelligence relieves clinical workloads and changes current medical workflows while maintaining the high quality needed for precision medicine across multiple solid tumors. Given its clear advantages in imaging practice, patients will benefit from earlier diagnosis and more appropriate treatment.



INTRODUCTION

Cancer is currently a worldwide health problem. Early diagnosis and timely treatment are crucial in reducing cancer-related mortality. Medical imaging is a common technique used to guide the clinical diagnosis of solid tumors. Accurate interpretation of imaging data has become an important but difficult task in the diagnosis process.

Artificial intelligence (AI) refers to an information science that researches and develops theories, methods, technologies and application systems used to simulate, expand and extend human intelligence[1]. With the rapid development of machine learning, deep learning and other key AI technologies in the field of image processing in recent years, these approaches have made great contributions to disease classification, prognosis prediction and therapy evaluation and can identify patterns that humans cannot recognize[2-4] (Figure 1). Here, we review the advantages of AI applications in imaging examinations of multiple solid tumors and, based on practical data and literature reports, highlight their benefits in optimizing the clinical work process, providing accurate tumor assessment for current precision medicine and achieving better diagnosis and treatment results.

Figure 1
Figure 1 A flowchart of artificial intelligence model construction. AI: Artificial intelligence.
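The construct-and-validate loop of Figure 1 (assemble labeled images, split into training and validation sets, fit a model, evaluate on held-out data) can be sketched minimally in Python. Everything below is our own illustration: the toy single-feature "threshold model" and the data split are hypothetical, not any model from the studies reviewed here.

```python
import random

def train_validate(samples, labels, fit, predict, val_frac=0.3, seed=42):
    """Generic loop from Figure 1: split, fit on the training portion,
    report accuracy on the held-out validation portion."""
    rng = random.Random(seed)
    idx = list(range(len(samples)))
    rng.shuffle(idx)
    n_val = int(len(idx) * val_frac)
    val, train = idx[:n_val], idx[n_val:]
    model = fit([samples[i] for i in train], [labels[i] for i in train])
    hits = sum(predict(model, samples[i]) == labels[i] for i in val)
    return model, hits / len(val)

def fit_threshold(xs, ys):
    """Toy 'model': the single feature cutoff that best separates classes."""
    return max(sorted(set(xs)),
               key=lambda t: sum((x > t) == bool(y) for x, y in zip(xs, ys)))

def predict_threshold(t, x):
    return int(x > t)
```

In practice the fitted model would be a deep network and the evaluation would report AUC, sensitivity and specificity rather than raw accuracy, but the split-fit-validate skeleton is the same.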
APPLICATION OF AI IN GASTROINTESTINAL TUMORS

Gastric cancer is one of the most common gastrointestinal malignancies, with a poor prognosis and high mortality. Endoscopy and pathological biopsy remain the “gold standard” for the diagnosis of gastric cancer, but they have shortcomings[5]. For example, the sensitivity of endoscopic diagnosis of atrophic gastritis is only 42%, so the rate of missed diagnosis is relatively high[6]. Multipoint biopsy sampling increases the risk of tissue injury and gastrorrhagia[7,8]. Some advanced endoscopic techniques, such as chromoendoscopy combined with magnifying endoscopy and confocal laser endomicroscopy, can provide only images of the mucosal surface of the gastrointestinal tract[7-9]. Billah et al[10] used capsule endoscopy along with a convolutional neural network (CNN) and color wavelet features to identify gastrointestinal polyps. Urban et al[11] applied deep neural networks to identify colonic polyps on colonoscopy. Lahner et al[12] established a decision support system (DSS) for the diagnosis of atrophic gastritis without endoscopy. The diagnostic accuracy of all three protocols was above 96%, which supports the promising generalization of AI-based technologies.

Esophageal squamous cell cancer

Narrow-band imaging (NBI) is an emerging, advanced, noninvasive endoscopic technology that can strengthen the evaluation of the surface structure and microvascular morphology of the esophagus and improve the accuracy of endoscopic diagnosis[13]. Using NBI to diagnose squamous cell carcinoma can yield variable results because of differing judgments among doctors[14,15]. Fukuda et al[16] applied a deep CNN model to NBI endoscopy video images of squamous cell carcinoma, showing higher detection sensitivity (91.1%) than experts and high detection accuracy (88.3%). Those authors suggested that the AI system can discover tumors > 30 mm or with muscularis mucosae invasion that were missed by experts. Compared to endoscopic experts, AI thus shows better diagnostic performance.

Atrophic gastritis

The CNN-chronic atrophic gastritis model developed by Zhang et al[7] shows good classification performance in recognizing chronic atrophic gastritis from gastric antrum images, with an area under the curve (AUC) close to 0.99. The accuracy, sensitivity and specificity of CNN-chronic atrophic gastritis in atrophic gastritis diagnosis are all above 0.94. In this study, 1458 mild, 1348 moderate and 38 severe cases of atrophic gastritis were tested by the CNN model, with accuracy rates of 0.93, 0.95 and 0.99, respectively, indicating good consistency of CNN model recognition with the clinical diagnosis of atrophic gastritis.
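The AUC figures quoted throughout this review summarize ranking performance: the probability that a randomly chosen diseased case receives a higher model score than a randomly chosen healthy one. A minimal sketch (the scores below are invented for illustration, not data from any cited study):

```python
def auc(pos_scores, neg_scores):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the fraction of (positive, negative) pairs ranked correctly (ties = 1/2)."""
    pairs = [(p > n) + 0.5 * (p == n) for p in pos_scores for n in neg_scores]
    return sum(pairs) / len(pairs)

# A model that ranks every diseased case above every healthy one has AUC 1.0
print(auc([0.9, 0.8, 0.7], [0.2, 0.1]))  # 1.0
```

An AUC near 0.99, as reported for the CNN-chronic atrophic gastritis model, means almost every diseased image is scored above almost every healthy one.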

However, the literature has reported that AI technology applied to stomach cancer or esophagogastric adenocarcinoma is susceptible to problems related to tumor morphology, atrophic change, uneven mucosal background, etc., which lead to low specificity and a high false positive rate (FPR)[17]. Nonetheless, several studies have indicated that the application of AI in the clinic has high accuracy. Combined with the expertise of endoscopists, AI can help doctors better diagnose atrophic gastritis, increase the rate of early gastric cancer diagnosis and avoid unnecessary pathological biopsy[18,19].

Early gastric cancer

Regarding small early gastric tumors, Abe et al[18] showed that AI technology can find anomalies faster than endoscopists (45.5 s vs 173.0 min) and with higher sensitivity (58.4% vs 31.9%). However, the positive predictive value (PPV) and specificity of AI technology were relatively lower than those of endoscopists (26.0% vs 46.2% and 87.3% vs 97.0%, respectively)[18]. A computer-aided diagnosis (CAD) system applied to still images of magnifying endoscopy combined with NBI achieved an accuracy of 85.3% for early gastric cancer diagnosis[20]. When endoscopy cannot identify and capture images of lesions, magnifying endoscopy combined with NBI video in the CAD system can support real-time clinical diagnosis of early gastric cancer. Horiuchi et al[19] proposed that the diagnostic performance of the CAD system using magnifying endoscopy combined with NBI video is equal to or better than that of 11 experienced endoscopic experts in early gastric cancer: the AUC was 0.8684, and the accuracy, sensitivity, specificity, PPV and negative predictive value were 85.1%, 87.4%, 82.8%, 83.5% and 86.7%, respectively[19].
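The sensitivity, specificity, PPV and negative predictive value traded off in these studies all derive from a single 2 × 2 confusion matrix. A short sketch (the counts below are invented for illustration, not data from Abe et al[18] or Horiuchi et al[19]):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Diagnostic performance measures from confusion-matrix counts:
    tp/fp = true/false positives, tn/fn = true/false negatives."""
    return {
        "sensitivity": tp / (tp + fn),           # recall on diseased cases
        "specificity": tn / (tn + fp),           # recall on healthy cases
        "ppv": tp / (tp + fp),                   # positive predictive value
        "npv": tn / (tn + fn),                   # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

# Invented counts: 8 true positives, 2 false positives,
# 9 true negatives, 1 false negative
m = diagnostic_metrics(tp=8, fp=2, tn=9, fn=1)
print(m["ppv"], m["accuracy"])  # 0.8 0.85
```

The pattern seen with the AI of Abe et al[18] (high sensitivity, low PPV) corresponds to a matrix with few false negatives but many false positives.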

Colorectal cancer

Colonoscopy is the key technique for the diagnosis of colorectal polyps. However, several studies have shown that 15.4% of small colorectal lesions (≤ 3 mm) were diagnosed as adenomas under endoscopy but judged to be normal mucosa on pathological examination[21]. Intraobserver and interobserver discrepancies are the main problem[22]. Therefore, some studies have suggested that AI techniques combined with endoscopy and imaging may help physicians identify colorectal lesions and perform pathological classification and prognosis prediction[22].

Shahidi et al[21] established a real-time AI-based clinical DSS to assess the differences between endoscopic and pathological results in lesions ≤ 3 mm. Of 644 lesions, 458 reached agreement, while significant differences were found in 99 cases (adenoma under endoscopy but normal mucosa on pathological examination). On further evaluation with the clinical DSS, the DSS findings in 90 of these cases conformed to those from endoscopy (coincidence rate 90.9%), supporting AI objectivity prior to pathological examination and interpretation[21]. Yang et al[22] proposed a CNN model whose diagnostic accuracy was better than or similar to that of endoscopic experts (71.5% vs 67.5%); applications built on this CNN model can help endoscopists identify colorectal lesions and reduce the misdiagnosis rate. The CNN model can also discriminate between advanced colorectal cancer and noncancerous lesions, helping endoscopists effectively choose the best treatment strategy[22]. Randomized clinical trials are needed to determine whether the CNN model applied to real-time endoscopic video can help endoscopists detect tiny or easily overlooked lesions during examination.

Wang et al[23] explored the feasibility of Faster R-CNN (faster region-based CNN) technology. Using transfer learning with the images and features of the ImageNet-pretrained VGG16 model, they automatically identified positive circumferential resection margins on high-resolution magnetic resonance imaging (MRI) of rectal cancer, with accuracy, sensitivity and specificity of 93.2%, 83.8% and 95.6%, respectively[23]. The use of 18F-fluorodeoxyglucose positron emission tomography (PET)/computed tomography (CT) to assess early changes in glucose metabolism parameters during neoadjuvant chemotherapy can predict treatment efficacy[24,25]. However, traditional 18F-fluorodeoxyglucose PET/CT cannot accurately and safely select patients for organ preservation strategies[26]. Williams et al[27] suggested that random forest is one AI technique suitable for tumor classification and regression evaluation. Shen et al[28] used random forest to demonstrate that radiomics obtained from baseline 18F-fluorodeoxyglucose PET could predict pathological complete response with 95.3% accuracy.
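A random forest reaches such predictions by majority-voting an ensemble of trees, each fitted to a bootstrap resample of the training data. A deliberately tiny sketch of that idea, using depth-one "stumps" in place of full trees (the data and single-feature encoding are hypothetical, not the radiomic features used by Shen et al[28]):

```python
import random

def fit_stump(X, y):
    """Best single (feature, threshold, polarity) split on this sample."""
    best = None
    for j in range(len(X[0])):
        for t in sorted({row[j] for row in X}):
            err = sum((row[j] > t) != bool(yi) for row, yi in zip(X, y))
            for pol, e in ((1, err), (0, len(y) - err)):
                if best is None or e < best[0]:
                    best = (e, j, t, pol)
    return best[1:]

def stump_predict(stump, row):
    j, t, pol = stump
    p = int(row[j] > t)
    return p if pol else 1 - p

def fit_forest(X, y, n_trees=25, seed=0):
    """Random-forest-style ensemble: each stump sees a bootstrap resample."""
    rng = random.Random(seed)
    forest = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(X)) for _ in X]        # sample with replacement
        forest.append(fit_stump([X[i] for i in idx], [y[i] for i in idx]))
    return forest

def forest_predict(forest, row):
    votes = sum(stump_predict(s, row) for s in forest)  # majority vote
    return int(votes * 2 >= len(forest))
```

Production random forests additionally grow deep trees and subsample features at each split, but the bootstrap-and-vote structure is the same.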

APPLICATION OF AI IN BREAST TUMORS

Ultrasound and radiography are common imaging techniques in breast examination for cancer screening, diagnosis and treatment. Ultrasound is important for the noninvasive measurement of cancer lesions and lymphatic metastasis, increasing the positive diagnostic rate for tiny, aggressive and lymph node-negative breast cancers[29]. However, ultrasound has lower diagnostic specificity and PPV for breast cancer[30]. For example, pathological biopsy detects axillary positivity in 15% to 20% of cases, which is often missed by ultrasound, especially for lesions with nonspecific characteristics such as unclear or irregularly shaped edges or fat loss[31]. Although MRI is highly sensitive for the diagnosis of breast cancer, its FPR is as high as 74%[32]. Molybdenum target X-ray is sensitive to microcalcification and has the advantage of high cost performance; however, in dense breasts, where lesions are easily hidden, it is limited by a lower detection rate[33].

Zhou et al[29] proposed a CNN-based deep learning model to predict lymph node metastasis from the ultrasound characteristics of primary breast cancer. The data showed an AUC of approximately 0.90, with sensitivity and specificity above 80% and 70%, respectively. Mango et al[30] integrated their AI-based decision support system into ultrasound imaging, and the results showed that this technique is helpful for Breast Imaging Reporting and Data System (BI-RADS) classification, reducing intraobserver and interobserver variability. The incidence of variability between BI-RADS 3 and BI-RADS 4A or above was 13.6% with ultrasound alone and decreased to 10.8% when ultrasound was combined with decision support.

Spick et al[34] showed that adding diffusion-weighted imaging to MRI-guided vacuum-assisted breast biopsy could reduce the FPR by more than 30%. Penco et al[32] verified the accuracy of MRI-guided vacuum-assisted breast biopsy against histopathological results, reporting 94% accuracy, 84% sensitivity and 77% specificity, with a negative predictive value of up to 97%. Adachi et al[31] compared the diagnostic performance of AI using RetinaNet with that of expert readers in dynamic contrast-enhanced magnetic resonance for breast cancer detection; the former had a higher diagnostic performance than the latter (AUC 0.925 vs 0.884). With the support of AI, the diagnostic performance of the expert readers improved significantly (AUC 0.899). The sensitivity/specificity of standalone AI, experts without AI and experts with AI in breast cancer diagnosis were 0.926/0.828, 0.847/0.841 and 0.889/0.823, respectively. However, AI may misdiagnose normal breast tissue as malignant owing to background parenchymal enhancement or tissue density, or misdiagnose invasive ductal carcinoma near the axilla as normal axillary lymph nodes[31].

Sasaki et al[35] reported that the AI-based Transpara system narrowed the gap between computers and experts in the sensitivity of breast cancer detection on molybdenum target images. The expert detection sensitivity was 89%; with the Transpara system, the detection sensitivity for malignant lesions increased to 95%[35]. When interpreting breast images, the Transpara system can significantly increase the AUC and diagnostic sensitivity without increasing reading time[36].

In summary, AI technology increases the detection sensitivity of latent breast lesions while maintaining higher specificity. This technology also reduces the variability in interpretation and helps to improve the clinical diagnostic performance.

APPLICATION OF AI IN THYROID TUMORS

In recent years, with the increasing incidence of thyroid cancer, the accurate classification of thyroid lesions and the prediction of lymph node metastasis have become the core of clinical intervention[37,38]. Ultrasound is a noninvasive, easily accessible and economical examination tool, but its accuracy may vary with the professional background of the reader.

Barczyński et al[39] verified that the S-Detect™ model in a real-time CAD system showed no significant difference from experienced radiologists in the sensitivity, accuracy and negative predictive value of thyroid tumor classification. The overall accuracy of disease evaluation was 76% for surgeons with basic ultrasound skills not using the CAD system but 82% for those using it[39]. The sensitivity and negative predictive value of lesion classification by the CAD system were similar to those of ultrasound experts, and the system further helped locate thyroid nodules for subsequent puncture cytology. Nevertheless, the S-Detect™ model had defects in identifying calcifications[40].

Postoperative lymph node metastasis is a key factor in the local recurrence of thyroid carcinoma. It is necessary to use CT or ultrasound to judge whether lymph node metastasis is present before surgery[37,38]. A study conducted by Lee et al[41] confirmed that the AUC of the CAD system based on deep learning in the classification of thyroid neck lymph node metastasis from preoperative CT images was 0.884, and its diagnostic accuracy, sensitivity, specificity, PPV and negative predictive value were 82.8%, 80.2%, 83.0%, 83.0% and 80.2%, respectively.

APPLICATION OF AI IN PROSTATE CANCER

Serum prostate-specific antigen (PSA), digital rectal examination and transrectal ultrasound-guided prostate puncture are the main methods for the early diagnosis of prostate cancer[42]. A high PSA level (> 2 ng/mL) is an important indicator for postoperative monitoring and identification of prostate cancer recurrence[43].

Biopsy guided by MRI/ultrasound improves the clinical detection of prostate cancer[44,45]. MRI-based Prostate Imaging Reporting and Data System (PI-RADS) classification is affected by poor intrareader and inter-reader consistency, leading to differences of up to 40% in targeted biopsy. Adding AI helps harmonize PI-RADS assignments and improves reader consistency, achieving better (86%) agreement between detected results and pathological diagnosis[46].

Deep learning applications in the field of prostate malignant tumors have been widely used with MRI[47,48]. Although some patients were treated with radical prostatectomy and had serum PSA < 1, 11C-choline PET/CT still showed a 20.5% positive rate[49]. Prostate uptake of 18F-choline is associated with overall survival, making it as important as serum PSA and Gleason scores in distinguishing high-risk from low-risk patients. Polymeri et al[50] used an automatic estimation method based on deep learning, and the obtained prostate gland volume (71 mL) matched radiologists’ visual estimates (65 mL and 80 mL) within seconds. This approach significantly improved the accuracy and precision of PET/CT imaging in the diagnosis of prostate cancer.

Raciti et al[43] used the software Paige Prostate Alpha to significantly increase the detection rate of prostate cancer while maintaining high specificity; especially for small, poorly differentiated tumors, sensitivity could be increased by 30%, up to 90%. Similar AI systems can also be used to detect micrometastases in prostate cancer.

APPLICATION OF AI IN LUNG CANCER

When CT is used to screen pulmonary nodules, the Lung Imaging Reporting and Data System (Lung-RADS) can increase sensitivity, but its FPR is also high[51]. The CAD method has 100% sensitivity, but its specificity is extremely low (up to 8.2 false positive nodules per scan)[51]. The negative predictive value of PET/CT for lymph node lesions of peripheral T1 tumors (≤ 3 cm) is as high as 92%-94%[52].

Chauvie et al[51] applied new methods to digital tomosynthesis: (1) Binomial visual analysis, PPV 0.14 and sensitivity 0.95; (2) Lung-RADS, PPV 0.19 and sensitivity 0.65; (3) Logistic regression, PPV 0.29 and sensitivity 0.20; (4) Random forest, PPV 0.40 and sensitivity 0.30; and (5) Neural network, PPV 0.95 and sensitivity 0.90. These data indicated that the neural network was the only predictor of lung cancer with a high PPV and no loss in sensitivity. Tau et al[52] used a CNN to analyze the characteristics of the primary tumor based on PET and to evaluate lymph node metastasis in newly diagnosed non-small cell lung cancer patients. The sensitivity, specificity and accuracy for predicting positive lymph nodes were 0.74 ± 0.32, 0.84 ± 0.16 and 0.80 ± 0.17, respectively; those for predicting distal metastasis were 0.45 ± 0.08, 0.79 ± 0.06 and 0.63 ± 0.05, respectively. The sensitivity for predicting distant lymph node metastasis was low (24% at the early phase and 45% at the end of the monitoring period). The CNN had high specificity (91% in the M1 group and 79% in the follow-up group), but the PPV and negative predictive value in class M were lower at the end of follow-up (54.5% and 68.6%).

AI APPLICATION IN OTHER SOLID TUMORS
Hepatocellular carcinoma

Texture analysis of contrast-enhanced magnetic resonance is considered an imaging marker for predicting the early response of hepatocellular carcinoma patients before transarterial chemoembolization (TACE) treatment[53]; its accuracy in distinguishing complete from incomplete remission was 0.76. Preoperative dynamic CT texture analysis also has value in predicting the hepatocellular carcinoma response to TACE. Peng et al[54] used a CT-based deep learning technique (transfer learning) that compensated for inaccuracy caused by insufficient image information. Their three data sets (one training set and two validation sets) showed high AUCs for predicting the response to TACE: complete response (0.97, 0.98, 0.97), partial response (0.96, 0.96, 0.96), stable disease (0.95, 0.95, 0.94) and disease progression (0.96, 0.94, 0.97), with accuracies of 84.0%, 85.1% and 82.8%[54]. Therefore, the CT-based deep learning model helps physicians preliminarily estimate the initial response of hepatocellular carcinoma patients to TACE and predict its therapeutic effect.

Cervical cancer

Colposcopy is widely used to detect cervical intraepithelial neoplasia, and it can guide cervical biopsy in women with suspected cytological abnormalities or human papillomavirus infection[55,56]. In low- and middle-income countries lacking colposcopy resources, the diagnostic accuracy of cervical biopsy for detecting cervical intraepithelial neoplasia is quite low (30%-70%)[57]. The development and application of AI-guided (e.g., support vector machine-based) digital colposcopy has helped relieve these bottlenecks and improved the screening effectiveness for cervical cancer and the characterization of cervical lesions[58]. Another advantage of AI is the “real-time” diagnostic report, which further optimizes clinical workflows[58].

Pancreatic cancer

Accurate segmentation of the pancreas is important for AI training and AI-assisted guidance. Wolz et al[59] used multi-atlas technology, which achieved a Dice similarity coefficient (DSC) of only 0.70. Summers et al[60] used deep learning technology, which reached a DSC of 0.78. Wang et al[61] proposed an interactive fully convolutional network (iFCN) for segmentation of the pancreas, which did not achieve satisfactory results. Boers et al[62] posited that the latest interactive U-Net architecture is better than the iFCN because it produces a better initial segmentation (DSC 78.1% ± 8.7% vs 72.3% ± 11.4%), reaching expert performance faster than manual delineation (interactive U-Net: 8 min to 86% DSC; manual segmentation: 15 min to 87.5% DSC). The average time cost fell by 48.4%; however, because of the low visceral fat content in some patients, the boundary between the pancreas and surrounding tissues was unclear, which may lead to poor segmentation performance.
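The DSC used throughout this comparison measures the overlap between a predicted and a reference segmentation: twice the intersection divided by the sum of the two mask sizes. A minimal sketch over flattened binary masks (real pancreas masks would be 3D voxel arrays; the masks below are invented for illustration):

```python
def dice(mask_a, mask_b):
    """Dice similarity coefficient: 2|A ∩ B| / (|A| + |B|) for binary masks."""
    inter = sum(a and b for a, b in zip(mask_a, mask_b))
    total = sum(mask_a) + sum(mask_b)
    return 2 * inter / total if total else 1.0

# Two 8-voxel masks, each labeling 4 voxels, agreeing on 3 of them
print(dice([1, 1, 1, 0, 0, 0, 1, 0], [1, 1, 1, 1, 0, 0, 0, 0]))  # 0.75
```

A DSC of 0.78 thus means roughly three quarters overlap between the automatic and expert delineations, weighted toward the smaller of the two.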

Renal cancer

Histopathology is the gold standard for clear cell renal cell carcinoma evaluation[63]. The World Health Organization/International Society of Urological Pathology grading system is used to predict the prognosis of renal clear cell carcinoma[64-66]. Grading clear cell renal cell carcinoma from CT or MRI findings is often influenced by subjective factors[67-70]. Cui et al[71] studied machine learning algorithms to extract and analyze the profiles of small tumors. Grading prediction of clear cell renal cell carcinoma by machine learning based on multiparameter MRI or multiphase CT thus provides a valuable noninvasive assessment for clinicians in the preoperative management of renal tumors[71].

CONCLUSION

AI shows clear advantages in efficiency, specificity and sensitivity in the classification, identification and diagnosis of solid tumors. Integrated into imaging technology, AI optimizes clinical workflows, decreases discrepancies between readers and reduces the misdiagnosis rate, which helps clinicians effectively choose appropriate therapeutic strategies and accurately predict prognosis (Table 1). All these improvements bring great advantages and convenience to current precision medicine. Nevertheless, problems remain: for example, the FPR increases with certain tumor morphologies or an uneven mucosal background, and calcifications may go unidentified because of technical defects. Therefore, AI cannot completely replace humans at present. We believe that with the continuous improvement of AI technology, the application of AI in tumor diagnosis and treatment will have even better prospects, in tumors not limited to solid tumors.

Table 1 Summary of artificial intelligence application in clinical imaging examination.
| Publish date | Ref. | AI | Application scenario | Sensitivity | Accuracy | Specificity | PPV | NPV | AUC | DSC | Other |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 10/2020 | Fukuda et al[16] | CNN | Diagnosis of esophagus squamous cell cancer | 91.1% | 88.3% | | | | | | |
| 05/2020 | Zhang et al[7] | CNN | Diagnosis of chronic atrophic gastritis | 94.5% | 94.2% | 94.0% | | | 0.99 | | |
| 10/2020 | Horiuchi et al[19] | CAD | Diagnosis of early gastric cancer | 87.4% | 85.1% | 82.8% | 83.5% | 86.7% | 0.8684 | | |
| 02/2020 | Wang et al[23] | Faster R-CNN | Circumferential resection margin of rectal cancer | 83.8% | 93.2% | 95.6% | | | | | |
| 03/2020 | Shen et al[28] | RF | Pathological complete response of rectal cancer | | 95.3% | | | | | | |
| 01/2021 | Abe et al[18] | CNN | Diagnosis of gastric cancer | 58.4% | | 87.3% | 26.0% | | | | Detection time: 45.5 s |
| 01/2020 | Zhou et al[29] | CNN | Lymph node metastasis prediction from primary breast cancer | > 80% | | > 70% | | | 0.90 | | |
| 03/2020 | Penco et al[32] | DWI | MRI-guided vacuum-assisted breast biopsy | 84.0% | 94.0% | 77.0% | | 97.0% | | | |
| 05/2020 | Adachi et al[31] | RetinaNet | Diagnosis of breast cancer | 92.6% | | 82.8% | | | 0.925 | | |
| | | Readers without RetinaNet | | 84.7% | | 84.1% | | | 0.884 | | |
| | | Readers with RetinaNet | | 88.9% | | 82.3% | | | 0.899 | | |
| 02/2020 | Sasaki et al[35] | Experts | Diagnosis of breast cancer | 89.0% | | | | | | | |
| | | Experts with Transpara system | | 95.0% | | | | | | | |
| 06/2020 | Mango et al[30] | US | Diagnosis of BI-RADS 3 to BI-RADS 4A or above of breast cancer | | | | | | | | Variation: 13.6% |
| | | US + DS | | | | | | | | | Variation: 10.8% |
| 02/2020 | Barczyński et al[39] | Doctors without CAD | Classification of thyroid tumor | | 76.0% | | | | | | |
| | | Doctors with CAD | | | 82.0% | | | | | | |
| 06/2020 | Lee et al[41] | CAD | Diagnosis of thyroid neck lymph node metastasis | 80.2% | 82.8% | 83.0% | 83.0% | 80.2% | 0.884 | | |
| 03/2020 | Polymeri et al[50] | CNN | Prostate gland uptake in PET/CT | | | | | | | | Volume: 71 mL |
| 10/2020 | Raciti et al[43] | Paige Prostate Alpha | Diagnosis of prostate cancer | 90.0% | | | | | | | |
| 07/2020 | Chauvie et al[51] | Binomial visual analysis | Lung DTS | 95.0% | | | 14.0% | | | | |
| | | Lung-RADS | | 65.0% | | | 19.0% | | | | |
| | | Logistic regression | | 20.0% | | | 29.0% | | | | |
| | | RF | | 30.0% | | | 40.0% | | | | |
| | | Neural network | | 90.0% | | | 95.0% | | | | |
| 07/2020 | Tau et al[52] | CNN | Diagnosis of lymph node metastasis of lung cancer | 74% ± 32% | 80% ± 17% | 84% ± 16% | | | | | |
| | | | Prediction of distal metastasis of lung cancer | 45% ± 8% | 63% ± 5% | 79% ± 6% | 54.5% | 68.6% | | | |
| 01/2020 | Peng et al[54] | Transfer learning | Prediction of TACE treatment response of hepatocellular carcinoma | | > 82.8% | | | | > 0.94 | | |
| 09/2013 | Wolz et al[59] | Multi-atlas technology | Segmentation of the pancreas | | | | | | | 70.0% | |
| 08/2020 | Gibson et al[62] | Deep learning technology | | | | | | | | 78.0% | |
| | | iFCN | | | | | | | | 72.3% ± 11.4% | |
| | | Artificial segmentation | | | | | | | | | 15 min to 87.5% DSC |
Footnotes

Manuscript source: Invited manuscript

Specialty type: Methodology

Country/Territory of origin: China

Peer-review report’s scientific quality classification

Grade A (Excellent): 0

Grade B (Very good): 0

Grade C (Good): C

Grade D (Fair): D

Grade E (Poor): 0

P-Reviewer: Hong YY, Liu GH S-Editor: Wang JL L-Editor: Filipodia P-Editor: Yuan YY

References
1. Zhou B, Xu JW, Cheng YG, Gao JY, Hu SY, Wang L, Zhan HX. Early detection of pancreatic cancer: Where are we now and where are we going? Int J Cancer. 2017;141:231-241.
2. Ding MQ, Chen L, Cooper GF, Young JD, Lu X. Precision Oncology beyond Targeted Therapy: Combining Omics Data with Machine Learning Matches the Majority of Cancer Cells to Effective Therapeutics. Mol Cancer Res. 2018;16:269-278.
3. Bibault JE, Giraud P, Burgun A. Big Data and machine learning in radiation oncology: State of the art and future prospects. Cancer Lett. 2016;382:110-117.
4. Fröhlich H, Balling R, Beerenwinkel N, Kohlbacher O, Kumar S, Lengauer T, Maathuis MH, Moreau Y, Murphy SA, Przytycka TM, Rebhan M, Röst H, Schuppert A, Schwab M, Spang R, Stekhoven D, Sun J, Weber A, Ziemek D, Zupan B. From hype to reality: data science enabling personalized medicine. BMC Med. 2018;16:150.
5. Feng W, Ding Y, Zong W, Ju S. Non-coding RNAs in regulating gastric cancer metastasis. Clin Chim Acta. 2019;496:125-133.
6. Du Y, Bai Y, Xie P, Fang J, Wang X, Hou X, Tian D, Wang C, Liu Y, Sha W, Wang B, Li Y, Zhang G, Shi R, Xu J, Huang M, Han S, Liu J, Ren X, Wang Z, Cui L, Sheng J, Luo H, Zhao X, Dai N, Nie Y, Zou Y, Xia B, Fan Z, Chen Z, Lin S, Li ZS; Chinese Chronic Gastritis Research group. Chronic gastritis in China: a national multi-center survey. BMC Gastroenterol. 2014;14:21.
7. Zhang Y, Li F, Yuan F, Zhang K, Huo L, Dong Z, Lang Y, Zhang Y, Wang M, Gao Z, Qin Z, Shen L. Diagnosing chronic atrophic gastritis by gastroscopy using artificial intelligence. Dig Liver Dis. 2020;52:566-572.
8. Guimarães P, Keller A, Fehlmann T, Lammert F, Casper M. Deep-learning based detection of gastric precancerous conditions. Gut. 2020;69:4-6.
9. Liu T, Zheng H, Gong W, Chen C, Jiang B. The accuracy of confocal laser endomicroscopy, narrow band imaging, and chromoendoscopy for the detection of atrophic gastritis. J Clin Gastroenterol. 2015;49:379-386.
10. Billah M, Waheed S, Rahman MM. An Automatic Gastrointestinal Polyp Detection System in Video Endoscopy Using Fusion of Color Wavelet and Convolutional Neural Network Features. Int J Biomed Imaging. 2017;2017:9545920.
11. Urban G, Tripathi P, Alkayali T, Mittal M, Jalali F, Karnes W, Baldi P. Deep Learning Localizes and Identifies Polyps in Real Time With 96% Accuracy in Screening Colonoscopy. Gastroenterology. 2018;155:1069-1078.e8.
12. Lahner E, Grossi E, Intraligi M, Buscema M, Corleto VD, Delle Fave G, Annibale B. Possible contribution of artificial neural networks and linear discriminant analysis in recognition of patients with suspected atrophic body gastritis. World J Gastroenterol. 2005;11:5867-5873.
13. Muto M, Minashi K, Yano T, Saito Y, Oda I, Nonaka S, Omori T, Sugiura H, Goda K, Kaise M, Inoue H, Ishikawa H, Ochiai A, Shimoda T, Watanabe H, Tajiri H, Saito D. Early detection of superficial squamous cell carcinoma in the head and neck region and esophagus by narrow band imaging: a multicenter randomized controlled trial. J Clin Oncol. 2010;28:1566-1572.
14.  Ohmori M, Ishihara R, Aoyama K, Nakagawa K, Iwagami H, Matsuura N, Shichijo S, Yamamoto K, Nagaike K, Nakahara M, Inoue T, Aoi K, Okada H, Tada T. Endoscopic detection and differentiation of esophageal lesions using a deep neural network. Gastrointest Endosc 2020; 91: 301-309. e1.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 74]  [Cited by in F6Publishing: 72]  [Article Influence: 18.0]  [Reference Citation Analysis (0)]
15.  Ishihara R, Takeuchi Y, Chatani R, Kidu T, Inoue T, Hanaoka N, Yamamoto S, Higashino K, Uedo N, Iishi H, Tatsuta M, Tomita Y, Ishiguro S. Prospective evaluation of narrow-band imaging endoscopy for screening of esophageal squamous mucosal high-grade neoplasia in experienced and less experienced endoscopists. Dis Esophagus. 2010;23:480-486.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 80]  [Cited by in F6Publishing: 87]  [Article Influence: 6.2]  [Reference Citation Analysis (0)]
16.  Fukuda H, Ishihara R, Kato Y, Matsunaga T, Nishida T, Yamada T, Ogiyama H, Horie M, Kinoshita K, Tada T. Comparison of performances of artificial intelligence versus expert endoscopists for real-time assisted diagnosis of esophageal squamous cell carcinoma (with video). Gastrointest Endosc. 2020;92:848-855.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 45]  [Cited by in F6Publishing: 52]  [Article Influence: 13.0]  [Reference Citation Analysis (0)]
17.  Iwagami H, Ishihara R, Aoyama K, Fukuda H, Shimamoto Y, Kono M, Nakahira H, Matsuura N, Shichijo S, Kanesaka T, Kanzaki H, Ishii T, Nakatani Y, Tada T. Artificial intelligence for the detection of esophageal and esophagogastric junctional adenocarcinoma. J Gastroenterol Hepatol. 2021;36:131-136.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 16]  [Cited by in F6Publishing: 15]  [Article Influence: 5.0]  [Reference Citation Analysis (0)]
18.  Abe S, Oda I. How can endoscopists adapt and collaborate with artificial intelligence for early gastric cancer detection? Dig Endosc. 2021;33:98-99.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 5]  [Cited by in F6Publishing: 4]  [Article Influence: 1.3]  [Reference Citation Analysis (0)]
19.  Horiuchi Y, Hirasawa T, Ishizuka N, Tokai Y, Namikawa K, Yoshimizu S, Ishiyama A, Yoshio T, Tsuchida T, Fujisaki J, Tada T. Performance of a computer-aided diagnosis system in diagnosing early gastric cancer using magnifying endoscopy videos with narrow-band imaging (with videos). Gastrointest Endosc 2020; 92: 856-865. e1.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 40]  [Cited by in F6Publishing: 42]  [Article Influence: 10.5]  [Reference Citation Analysis (0)]
20.  Horiuchi Y, Aoyama K, Tokai Y, Hirasawa T, Yoshimizu S, Ishiyama A, Yoshio T, Tsuchida T, Fujisaki J, Tada T. Convolutional Neural Network for Differentiating Gastric Cancer from Gastritis Using Magnified Endoscopy with Narrow Band Imaging. Dig Dis Sci. 2020;65:1355-1363.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 70]  [Cited by in F6Publishing: 78]  [Article Influence: 19.5]  [Reference Citation Analysis (1)]
21.  Shahidi N, Rex DK, Kaltenbach T, Rastogi A, Ghalehjegh SH, Byrne MF. Use of Endoscopic Impression, Artificial Intelligence, and Pathologist Interpretation to Resolve Discrepancies Between Endoscopy and Pathology Analyses of Diminutive Colorectal Polyps. Gastroenterology 2020; 158: 783-785. e1.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 27]  [Cited by in F6Publishing: 26]  [Article Influence: 6.5]  [Reference Citation Analysis (0)]
22.  Yang YJ, Cho BJ, Lee MJ, Kim JH, Lim H, Bang CS, Jeong HM, Hong JT, Baik GH. Automated Classification of Colorectal Neoplasms in White-Light Colonoscopy Images via Deep Learning. J Clin Med. 2020;9.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 33]  [Cited by in F6Publishing: 21]  [Article Influence: 5.3]  [Reference Citation Analysis (0)]
23.  Wang D, Xu J, Zhang Z, Li S, Zhang X, Zhou Y, Lu Y. Evaluation of Rectal Cancer Circumferential Resection Margin Using Faster Region-Based Convolutional Neural Network in High-Resolution Magnetic Resonance Images. Dis Colon Rectum. 2020;63:143-151.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 14]  [Cited by in F6Publishing: 14]  [Article Influence: 3.5]  [Reference Citation Analysis (0)]
24.  Guillem JG, Moore HG, Akhurst T, Klimstra DS, Ruo L, Mazumdar M, Minsky BD, Saltz L, Wong WD, Larson S. Sequential preoperative fluorodeoxyglucose-positron emission tomography assessment of response to preoperative chemoradiation: a means for determining longterm outcomes of rectal cancer. J Am Coll Surg. 2004;199:1-7.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 150]  [Cited by in F6Publishing: 160]  [Article Influence: 8.0]  [Reference Citation Analysis (0)]
25.  Capirci C, Rampin L, Erba PA, Galeotti F, Crepaldi G, Banti E, Gava M, Fanti S, Mariani G, Muzzio PC, Rubello D. Sequential FDG-PET/CT reliably predicts response of locally advanced rectal cancer to neo-adjuvant chemo-radiation therapy. Eur J Nucl Med Mol Imaging. 2007;34:1583-1593.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 130]  [Cited by in F6Publishing: 119]  [Article Influence: 7.0]  [Reference Citation Analysis (0)]
26.  Joye I, Deroose CM, Vandecaveye V, Haustermans K. The role of diffusion-weighted MRI and (18)F-FDG PET/CT in the prediction of pathologic complete response after radiochemotherapy for rectal cancer: a systematic review. Radiother Oncol. 2014;113:158-165.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 122]  [Cited by in F6Publishing: 108]  [Article Influence: 12.0]  [Reference Citation Analysis (0)]
27.  Williams JK. Using random forests to diagnose aviation turbulence. Mach Learn. 2014;95:51-70.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 63]  [Cited by in F6Publishing: 48]  [Article Influence: 4.4]  [Reference Citation Analysis (0)]
28.  Shen WC, Chen SW, Wu KC, Lee PY, Feng CL, Hsieh TC, Yen KY, Kao CH. Predicting pathological complete response in rectal cancer after chemoradiotherapy with a random forest using 18F-fluorodeoxyglucose positron emission tomography and computed tomography radiomics. Ann Transl Med. 2020;8:207.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 12]  [Cited by in F6Publishing: 15]  [Article Influence: 3.8]  [Reference Citation Analysis (0)]
29.  Zhou LQ, Wu XL, Huang SY, Wu GG, Ye HR, Wei Q, Bao LY, Deng YB, Li XR, Cui XW, Dietrich CF. Lymph Node Metastasis Prediction from Primary Breast Cancer US Images Using Deep Learning. Radiology. 2020;294:19-28.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 195]  [Cited by in F6Publishing: 147]  [Article Influence: 36.8]  [Reference Citation Analysis (0)]
30.  Mango VL, Sun M, Wynn RT, Ha R. Should We Ignore, Follow, or Biopsy? AJR Am J Roentgenol. 2020;214:1445-1452.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 34]  [Cited by in F6Publishing: 43]  [Article Influence: 10.8]  [Reference Citation Analysis (0)]
31.  Adachi M, Fujioka T, Mori M, Kubota K, Kikuchi Y, Xiaotong W, Oyama J, Kimura K, Oda G, Nakagawa T, Uetake H, Tateishi U. Detection and Diagnosis of Breast Cancer Using Artificial Intelligence Based assessment of Maximum Intensity Projection Dynamic Contrast-Enhanced Magnetic Resonance Images. Diagnostics (Basel). 2020;10.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 19]  [Cited by in F6Publishing: 26]  [Article Influence: 6.5]  [Reference Citation Analysis (0)]
32.  Penco S, Rotili A, Pesapane F, Trentin C, Dominelli V, Faggian A, Farina M, Marinucci I, Bozzini A, Pizzamiglio M, Ierardi AM, Cassano E. MRI-guided vacuum-assisted breast biopsy: experience of a single tertiary referral cancer centre and prospects for the future. Med Oncol. 2020;37:36.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 4]  [Cited by in F6Publishing: 4]  [Article Influence: 1.0]  [Reference Citation Analysis (0)]
33.  Hu Y, Zhang Y, Cheng J. Diagnostic value of molybdenum target combined with DCE-MRI in different types of breast cancer. Oncol Lett. 2019;18:4056-4063.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 5]  [Cited by in F6Publishing: 5]  [Article Influence: 1.0]  [Reference Citation Analysis (0)]
34.  Spick C, Pinker-Domenig K, Rudas M, Helbich TH, Baltzer PA. MRI-only lesions: application of diffusion-weighted imaging obviates unnecessary MR-guided breast biopsies. Eur Radiol. 2014;24:1204-1210.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 76]  [Cited by in F6Publishing: 79]  [Article Influence: 7.9]  [Reference Citation Analysis (0)]
35.  Sasaki M, Tozaki M, Rodríguez-Ruiz A, Yotsumoto D, Ichiki Y, Terawaki A, Oosako S, Sagara Y. Artificial intelligence for breast cancer detection in mammography: experience of use of the ScreenPoint Medical Transpara system in 310 Japanese women. Breast Cancer. 2020;27:642-651.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 19]  [Cited by in F6Publishing: 26]  [Article Influence: 6.5]  [Reference Citation Analysis (0)]
36.  Rodriguez-Ruiz A, Lång K, Gubern-Merida A, Broeders M, Gennaro G, Clauser P, Helbich TH, Chevalier M, Tan T, Mertelmeier T, Wallis MG, Andersson I, Zackrisson S, Mann RM, Sechopoulos I. Stand-Alone Artificial Intelligence for Breast Cancer Detection in Mammography: Comparison With 101 Radiologists. J Natl Cancer Inst. 2019;111:916-922.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 211]  [Cited by in F6Publishing: 263]  [Article Influence: 65.8]  [Reference Citation Analysis (0)]
37.  Shin JH, Baek JH, Chung J, Ha EJ, Kim JH, Lee YH, Lim HK, Moon WJ, Na DG, Park JS, Choi YJ, Hahn SY, Jeon SJ, Jung SL, Kim DW, Kim EK, Kwak JY, Lee CY, Lee HJ, Lee JH, Lee KH, Park SW, Sung JY;  Korean Society of Thyroid Radiology (KSThR) and Korean Society of Radiology. Ultrasonography Diagnosis and Imaging-Based Management of Thyroid Nodules: Revised Korean Society of Thyroid Radiology Consensus Statement and Recommendations. Korean J Radiol. 2016;17:370-395.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 509]  [Cited by in F6Publishing: 590]  [Article Influence: 73.8]  [Reference Citation Analysis (0)]
38.  Haugen BR. 2015 American Thyroid Association Management Guidelines for Adult Patients with Thyroid Nodules and Differentiated Thyroid Cancer: What is new and what has changed? Cancer. 2017;123:372-381.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 273]  [Cited by in F6Publishing: 352]  [Article Influence: 44.0]  [Reference Citation Analysis (0)]
39.  Barczyński M, Stopa-Barczyńska M, Wojtczak B, Czarniecka A, Konturek A. Clinical validation of S-DetectTM mode in semi-automated ultrasound classification of thyroid lesions in surgical office. Gland Surg. 2020;9:S77-S85.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 10]  [Cited by in F6Publishing: 18]  [Article Influence: 4.5]  [Reference Citation Analysis (0)]
40.  Kim HL, Ha EJ, Han M. Real-World Performance of Computer-Aided Diagnosis System for Thyroid Nodules Using Ultrasonography. Ultrasound Med Biol. 2019;45:2672-2678.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 39]  [Cited by in F6Publishing: 32]  [Article Influence: 6.4]  [Reference Citation Analysis (0)]
41.  Lee JH, Ha EJ, Kim D, Jung YJ, Heo S, Jang YH, An SH, Lee K. Application of deep learning to the diagnosis of cervical lymph node metastasis from thyroid cancer with CT: external validation and clinical utility for resident training. Eur Radiol. 2020;30:3066-3072.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 21]  [Cited by in F6Publishing: 39]  [Article Influence: 9.8]  [Reference Citation Analysis (0)]
42.  Kroenig M, Schaal K, Benndorf M, Soschynski M, Lenz P, Krauss T, Drendel V, Kayser G, Kurz P, Werner M, Wetterauer U, Schultze-Seemann W, Langer M, Jilg CA. Diagnostic Accuracy of Robot-Guided, Software Based Transperineal MRI/TRUS Fusion Biopsy of the Prostate in a High Risk Population of Previously Biopsy Negative Men. Biomed Res Int. 2016;2016:2384894.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 13]  [Cited by in F6Publishing: 14]  [Article Influence: 1.8]  [Reference Citation Analysis (0)]
43.  Raciti P, Sue J, Ceballos R, Godrich R, Kunz JD, Kapur S, Reuter V, Grady L, Kanan C, Klimstra DS, Fuchs TJ. Novel artificial intelligence system increases the detection of prostate cancer in whole slide images of core needle biopsies. Mod Pathol. 2020;33:2058-2066.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 60]  [Cited by in F6Publishing: 79]  [Article Influence: 19.8]  [Reference Citation Analysis (0)]
44.  Siddiqui MM, Rais-Bahrami S, Turkbey B, George AK, Rothwax J, Shakir N, Okoro C, Raskolnikov D, Parnes HL, Linehan WM, Merino MJ, Simon RM, Choyke PL, Wood BJ, Pinto PA. Comparison of MR/ultrasound fusion-guided biopsy with ultrasound-guided biopsy for the diagnosis of prostate cancer. JAMA. 2015;313:390-397.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 1068]  [Cited by in F6Publishing: 1095]  [Article Influence: 121.7]  [Reference Citation Analysis (0)]
45.  Kasivisvanathan V, Rannikko AS, Borghi M, Panebianco V, Mynderse LA, Vaarala MH, Briganti A, Budäus L, Hellawell G, Hindley RG, Roobol MJ, Eggener S, Ghei M, Villers A, Bladou F, Villeirs GM, Virdi J, Boxler S, Robert G, Singh PB, Venderink W, Hadaschik BA, Ruffion A, Hu JC, Margolis D, Crouzet S, Klotz L, Taneja SS, Pinto P, Gill I, Allen C, Giganti F, Freeman A, Morris S, Punwani S, Williams NR, Brew-Graves C, Deeks J, Takwoingi Y, Emberton M, Moore CM;  PRECISION Study Group Collaborators. MRI-Targeted or Standard Biopsy for Prostate-Cancer Diagnosis. N Engl J Med. 2018;378:1767-1777.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 1568]  [Cited by in F6Publishing: 1800]  [Article Influence: 300.0]  [Reference Citation Analysis (0)]
46.  Sanford T, Harmon SA, Turkbey EB, Kesani D, Tuncer S, Madariaga M, Yang C, Sackett J, Mehralivand S, Yan P, Xu S, Wood BJ, Merino MJ, Pinto PA, Choyke PL, Turkbey B. Deep-Learning-Based Artificial Intelligence for PI-RADS Classification to Assist Multiparametric Prostate MRI Interpretation: A Development Study. J Magn Reson Imaging. 2020;52:1499-1507.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 31]  [Cited by in F6Publishing: 47]  [Article Influence: 11.8]  [Reference Citation Analysis (0)]
47.  Reda I, Khalil A, Elmogy M, Abou El-Fetouh A, Shalaby A, Abou El-Ghar M, Elmaghraby A, Ghazal M, El-Baz A. Deep Learning Role in Early Diagnosis of Prostate Cancer. Technol Cancer Res Treat. 2018;17:1533034618775530.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 40]  [Cited by in F6Publishing: 43]  [Article Influence: 8.6]  [Reference Citation Analysis (0)]
48.  Wang X, Yang W, Weinreb J, Han J, Li Q, Kong X, Yan Y, Ke Z, Luo B, Liu T, Wang L. Searching for prostate cancer by fully automated magnetic resonance imaging classification: deep learning vs non-deep learning. Sci Rep. 2017;7:15415.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 112]  [Cited by in F6Publishing: 92]  [Article Influence: 13.1]  [Reference Citation Analysis (0)]
49.  Giovacchini G, Guglielmo P, Mapelli P, Incerti E, Gajate AMS, Giovannini E, Riondato M, Briganti A, Gianolli L, Ciarmiello A, Picchio M. 11C-choline PET/CT predicts survival in prostate cancer patients with PSA < 1 NG/mL. Eur J Nucl Med Mol Imaging. 2019;46:921-929.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 11]  [Cited by in F6Publishing: 14]  [Article Influence: 2.8]  [Reference Citation Analysis (0)]
50.  Polymeri E, Sadik M, Kaboteh R, Borrelli P, Enqvist O, Ulén J, Ohlsson M, Trägårdh E, Poulsen MH, Simonsen JA, Hoilund-Carlsen PF, Johnsson ÅA, Edenbrandt L. Deep learning-based quantification of PET/CT prostate gland uptake: association with overall survival. Clin Physiol Funct Imaging. 2020;40:106-113.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 20]  [Cited by in F6Publishing: 24]  [Article Influence: 4.8]  [Reference Citation Analysis (0)]
51.  Chauvie S, De Maggi A, Baralis I, Dalmasso F, Berchialla P, Priotto R, Violino P, Mazza F, Melloni G, Grosso M;  SOS Study team. Artificial intelligence and radiomics enhance the positive predictive value of digital chest tomosynthesis for lung cancer detection within SOS clinical trial. Eur Radiol. 2020;30:4134-4140.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 11]  [Cited by in F6Publishing: 7]  [Article Influence: 1.8]  [Reference Citation Analysis (1)]
52.  Tau N, Stundzia A, Yasufuku K, Hussey D, Metser U. Convolutional Neural Networks in Predicting Nodal and Distant Metastatic Potential of Newly Diagnosed Non-Small Cell Lung Cancer on FDG PET Images. AJR Am J Roentgenol. 2020;215:192-197.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 23]  [Cited by in F6Publishing: 23]  [Article Influence: 5.8]  [Reference Citation Analysis (0)]
53.  Yu JY, Zhang HP, Tang ZY, Zhou J, He XJ, Liu YY, Liu XJ, Guo DJ. Value of texture analysis based on enhanced MRI for predicting an early therapeutic response to transcatheter arterial chemoembolisation combined with high-intensity focused ultrasound treatment in hepatocellular carcinoma. Clin Radiol 2018; 73: 758.e9-758. e18.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 25]  [Cited by in F6Publishing: 31]  [Article Influence: 5.2]  [Reference Citation Analysis (0)]
54.  Peng J, Kang S, Ning Z, Deng H, Shen J, Xu Y, Zhang J, Zhao W, Li X, Gong W, Huang J, Liu L. Residual convolutional neural network for predicting response of transarterial chemoembolization in hepatocellular carcinoma from CT imaging. Eur Radiol. 2020;30:413-424.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 69]  [Cited by in F6Publishing: 93]  [Article Influence: 18.6]  [Reference Citation Analysis (0)]
55.  Khan MJ, Werner CL, Darragh TM, Guido RS, Mathews C, Moscicki AB, Mitchell MM, Schiffman M, Wentzensen N, Massad LS, Mayeaux EJ Jr, Waxman AG, Conageski C, Einstein MH, Huh WK. ASCCP Colposcopy Standards: Role of Colposcopy, Benefits, Potential Harms, and Terminology for Colposcopic Practice. J Low Genit Tract Dis. 2017;21:223-229.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 50]  [Cited by in F6Publishing: 66]  [Article Influence: 11.0]  [Reference Citation Analysis (0)]
56.  Mayeaux EJ Jr, Novetsky AP, Chelmow D, Garcia F, Choma K, Liu AH, Papasozomenos T, Einstein MH, Massad LS, Wentzensen N, Waxman AG, Conageski C, Khan MJ, Huh WK. ASCCP Colposcopy Standards: Colposcopy Quality Improvement Recommendations for the United States. J Low Genit Tract Dis. 2017;21:242-248.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 15]  [Cited by in F6Publishing: 15]  [Article Influence: 2.5]  [Reference Citation Analysis (0)]
57.  Brown BH, Tidy JA. The diagnostic accuracy of colposcopy - A review of research methodology and impact on the outcomes of quality assurance. Eur J Obstet Gynecol Reprod Biol. 2019;240:182-186.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 16]  [Cited by in F6Publishing: 30]  [Article Influence: 6.0]  [Reference Citation Analysis (0)]
58.  Xue P, Ng MTA, Qiao Y. The challenges of colposcopy for cervical cancer screening in LMICs and solutions by artificial intelligence. BMC Med. 2020;18:169.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 37]  [Cited by in F6Publishing: 53]  [Article Influence: 13.3]  [Reference Citation Analysis (0)]
59.  Wolz R, Chu C, Misawa K, Fujiwara M, Mori K, Rueckert D. Automated abdominal multi-organ segmentation with subject-specific atlas generation. IEEE Trans Med Imaging. 2013;32:1723-1730.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 197]  [Cited by in F6Publishing: 133]  [Article Influence: 12.1]  [Reference Citation Analysis (0)]
60.  Summers RM, Elton DC, Lee S, Zhu Y, Liu J, Bagheri M, Sandfort V, Grayson PC, Mehta NN, Pinto PA, Linehan WM, Perez AA, Graffy PM, O'Connor SD, Pickhardt PJ. Atherosclerotic Plaque Burden on Abdominal CT: Automated Assessment With Deep Learning on Noncontrast and Contrast-enhanced Scans. Acad Radiol. 2020;.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 11]  [Cited by in F6Publishing: 16]  [Article Influence: 5.3]  [Reference Citation Analysis (0)]
61.  Wang G, Li W, Zuluaga MA, Pratt R, Patel PA, Aertsen M, Doel T, David AL, Deprest J, Ourselin S, Vercauteren T. Interactive Medical Image Segmentation Using Deep Learning With Image-Specific Fine Tuning. IEEE Trans Med Imaging. 2018;37:1562-1573.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 471]  [Cited by in F6Publishing: 256]  [Article Influence: 42.7]  [Reference Citation Analysis (0)]
62.  Boers TGW, Hu Y, Gibson E, Barratt DC, Bonmati E, Krdzalic J, van der Heijden F, Hermans JJ, Huisman HJ. Interactive 3D U-net for the segmentation of the pancreas in computed tomography scans. Phys Med Biol. 2020;65:065002.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 27]  [Cited by in F6Publishing: 17]  [Article Influence: 4.3]  [Reference Citation Analysis (0)]
63.  Halverson SJ, Kunju LP, Bhalla R, Gadzinski AJ, Alderman M, Miller DC, Montgomery JS, Weizer AZ, Wu A, Hafez KS, Wolf JS Jr. Accuracy of determining small renal mass management with risk stratified biopsies: confirmation by final pathology. J Urol. 2013;189:441-446.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 142]  [Cited by in F6Publishing: 158]  [Article Influence: 13.2]  [Reference Citation Analysis (0)]
64.  Dagher J, Delahunt B, Rioux-Leclercq N, Egevad L, Srigley JR, Coughlin G, Dunglinson N, Gianduzzo T, Kua B, Malone G, Martin B, Preston J, Pokorny M, Wood S, Yaxley J, Samaratunga H. Clear cell renal cell carcinoma: validation of World Health Organization/International Society of Urological Pathology grading. Histopathology. 2017;71:918-925.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 70]  [Cited by in F6Publishing: 87]  [Article Influence: 12.4]  [Reference Citation Analysis (0)]
65.  Delahunt B, Eble JN, Egevad L, Samaratunga H. Grading of renal cell carcinoma. Histopathology. 2019;74:4-17.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 75]  [Cited by in F6Publishing: 103]  [Article Influence: 20.6]  [Reference Citation Analysis (0)]
66.  Kim H, Inomoto C, Uchida T, Furuya H, Komiyama T, Kajiwara H, Kobayashi H, Nakamura N, Miyajima A. Verification of the International Society of Urological Pathology recommendations in Japanese patients with clear cell renal cell carcinoma. Int J Oncol. 2018;52:1139-1148.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 7]  [Cited by in F6Publishing: 13]  [Article Influence: 2.2]  [Reference Citation Analysis (0)]
67.  Zhao J, Zhang P, Chen X, Cao W, Ye Z. Lesion Size and Iodine Quantification to Distinguish Low-Grade From High-Grade Clear Cell Renal Cell Carcinoma Using Dual-Energy Spectral Computed Tomography. J Comput Assist Tomogr. 2016;40:673-677.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 9]  [Cited by in F6Publishing: 9]  [Article Influence: 1.3]  [Reference Citation Analysis (0)]
68.  Parada Villavicencio C, Mc Carthy RJ, Miller FH. Can diffusion-weighted magnetic resonance imaging of clear cell renal carcinoma predict low from high nuclear grade tumors. Abdom Radiol (NY). 2017;42:1241-1249.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 24]  [Cited by in F6Publishing: 26]  [Article Influence: 3.7]  [Reference Citation Analysis (0)]
69.  Aslan A, İnan İ, Aktan A, Ayaz E, Aslan M, Özkanlı SŞ, Yıldırım A, Yıkılmaz A. The utility of ADC measurement techniques for differentiation of low- and high-grade clear cell RCC. Pol J Radiol. 2018;83:e446-e451.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 8]  [Cited by in F6Publishing: 8]  [Article Influence: 1.3]  [Reference Citation Analysis (0)]
70.  Chen C, Kang Q, Xu B, Guo H, Wei Q, Wang T, Ye H, Wu X. Differentiation of low- and high-grade clear cell renal cell carcinoma: Tumor size versus CT perfusion parameters. Clin Imaging. 2017;46:14-19.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 14]  [Cited by in F6Publishing: 16]  [Article Influence: 2.3]  [Reference Citation Analysis (0)]
71.  Cui E, Li Z, Ma C, Li Q, Lei Y, Lan Y, Yu J, Zhou Z, Li R, Long W, Lin F. Predicting the ISUP grade of clear cell renal cell carcinoma with multiparametric MR and multiphase CT radiomics. Eur Radiol. 2020;30:2912-2921.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 31]  [Cited by in F6Publishing: 46]  [Article Influence: 11.5]  [Reference Citation Analysis (0)]