Minireviews Open Access
Copyright ©The Author(s) 2021. Published by Baishideng Publishing Group Inc. All rights reserved.
World J Gastroenterol. Apr 7, 2021; 27(13): 1283-1295
Published online Apr 7, 2021. doi: 10.3748/wjg.v27.i13.1283
Artificial intelligence for early detection of pancreatic adenocarcinoma: The future is promising
Antonio Mendoza Ladd, Department of Internal Medicine, Division of Gastroenterology, Texas Tech University Health Sciences Center El Paso, El Paso, TX 79905, United States
David L Diehl, Department of Gastroenterology and Nutrition, Geisinger Medical Center, Danville, PA 17822, United States
ORCID number: Antonio Mendoza Ladd (0000-0002-8013-3545); David L Diehl (0000-0003-4128-3839).
Author contributions: Both Mendoza Ladd A and Diehl D participated equally in the literature search and the drafting, editing and approval of the final manuscript.
Conflict-of-interest statement: The authors declare that no conflict of interest exists for this manuscript.
Open-Access: This article is an open-access article that was selected by an in-house editor and fully peer-reviewed by external reviewers. It is distributed in accordance with the Creative Commons Attribution NonCommercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited and the use is non-commercial. See: http://creativecommons.org/Licenses/by-nc/4.0/
Corresponding author: Antonio Mendoza Ladd, FACG, FASGE, Assistant Professor, Department of Internal Medicine, Division of Gastroenterology, Texas Tech University Health Sciences Center El Paso, 4800 Alberta Avenue, El Paso, TX 79905, United States. dr_ladd25@yahoo.com
Received: December 31, 2020
Peer-review started: December 31, 2020
First decision: January 17, 2021
Revised: January 22, 2021
Accepted: March 13, 2021
Article in press: March 13, 2021
Published online: April 7, 2021

Abstract

Pancreatic ductal adenocarcinoma (PDAC) is a worldwide public health concern. Despite extensive research efforts toward improving diagnosis and treatment, the 5-year survival rate is at best approximately 15%. This dismal figure can be attributed to a variety of factors, including the lack of adequate screening methods, late symptom onset, and treatment resistance. PDAC remains a grim diagnosis with a high mortality rate and a significant psychological burden for patients and their families. In recent years, artificial intelligence (AI) has permeated the medical field at an accelerated pace, bringing potential new tools that carry the promise of improving the diagnosis and treatment of a variety of diseases. In this review, we summarize the landscape of AI in the diagnosis and treatment of PDAC.

Key Words: Pancreatic adenocarcinoma, Artificial intelligence, Neural network, Future perspectives, Early diagnosis, Improved performance

Core Tip: Pancreatic adenocarcinoma is one of the deadliest malignancies in the world. Several factors are responsible for this, but delayed diagnosis is one of the most important. Despite improvements in diagnostic methods, early lesions are still missed in clinical practice. Artificial intelligence (AI)-assisted diagnostic methods have the potential to improve the clinical outcomes of these patients. However, major improvements in AI technology and its implementation need to occur before these potential benefits can be attained.



INTRODUCTION

Current modalities for the diagnosis and treatment of pancreatic ductal adenocarcinoma (PDAC) remain disappointing. With an overall survival rate of 3%-15%, it is one of the deadliest malignancies in the world[1]. Although it is currently the 7th leading cause of cancer death worldwide, recent trends suggest that PDAC will soon become the 2nd and 3rd leading cause in North America and Europe, respectively[2,3]. Several factors contribute to the poor survival statistics of PDAC: Lack of adequate screening tests, delayed diagnosis, and suboptimal treatment options. Consequently, improvements in all these areas are desperately needed.

Recent technological advances have led to the increased application of artificial intelligence (AI) in different disciplines. Because computers can store and analyze larger amounts of data than the human brain, AI has the potential to address unmet needs in medicine. Since improving outcomes for PDAC is an area of urgent need, this review will provide a summary of current and future applications of AI in the diagnosis and management of PDAC.

AI, MACHINE LEARNING, AND ARTIFICIAL NEURAL NETWORKS

AI is a branch of computer science dedicated to developing models aimed at performing functions comparable to those accomplished by the human brain. Machine learning (ML) is the area of AI that deals with developing computer models capable of learning specific tasks through the repetition of calculations derived from large amounts of data[4]. These models analyze the data through repetitive calculations using self-derived mathematical algorithms that are constantly adjusted until the model produces the desired outcome. Once the combination of adjustments necessary to achieve the outcome has been discovered, the computer has “learned” how to perform that specific task[5].
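The core of this “learning by adjustment” can be illustrated with a minimal sketch: the toy example below uses plain gradient descent to fit a straight line, nudging two parameters after every pass until the prediction error stops shrinking. The data, learning rate, and parameter names are arbitrary choices for illustration and do not correspond to any specific model cited in this review.

```python
import numpy as np

# Toy data: a hypothetical linear relationship with noise.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=200)
y = 3.0 * x + 2.0 + rng.normal(0, 1, size=200)

# Start from an arbitrary guess and repeatedly adjust the parameters
# in the direction that reduces the mean squared prediction error.
w, b, lr = 0.0, 0.0, 0.01
for step in range(2000):
    y_hat = w * x + b          # current prediction
    error = y_hat - y          # how far off the model is
    w -= lr * 2 * np.mean(error * x)
    b -= lr * 2 * np.mean(error)

print(f"learned parameters: w = {w:.2f}, b = {b:.2f}")  # should approach 3 and 2
```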

Broadly speaking, ML can be either supervised or unsupervised. The difference lies in whether the desired outcome of interest is previously known by the computer. In supervised learning, a computer is first introduced to a training dataset (the “input”) as well as the desired outcome of interest (the “output”). The computer then analyzes the input, making the necessary adjustments to the algorithm until it consistently produces the desired output[6]. This type of learning requires large amounts of training data that have been pre-labeled (“curated”) by a human operator. Once the training of the machine is completed, a different dataset is used to test its performance (testing data). In unsupervised learning, the computer is introduced to unlabeled data. The machine then sorts the data using the algorithm to identify features within it that can be grouped and analyzed further to reach a specific outcome[7]. Because the data in unsupervised ML are not curated, larger amounts of training data are required than for supervised ML.
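The contrast between the two approaches can be sketched in a few lines of Python using scikit-learn: a classifier is trained on curated labels (supervised), while a clustering algorithm groups the same data without ever seeing the labels (unsupervised). The synthetic dataset and model settings below are illustrative assumptions only.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.cluster import KMeans

# Synthetic, pre-labeled ("curated") dataset.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Supervised learning: the model is shown the desired output (the labels)
# during training, then evaluated on a separate testing dataset.
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)
print("supervised test accuracy:", clf.score(X_test, y_test))

# Unsupervised learning: no labels are provided; the algorithm groups
# the data into clusters on its own.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print("unsupervised cluster sizes:", [(clusters == k).sum() for k in (0, 1)])
```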

To date, most of the ML used in medicine has been supervised, and it has been made possible by the emergence of a relatively new discipline called radiomics. Radiomics studies the conversion of digital medical images into data that can then be subjected to statistical analysis[8]. Using computer technology, predefined quantitative features are extracted from computed tomography (CT), magnetic resonance imaging (MRI) or other imaging modalities. An essential step prior to this data extraction, however, is lesion “segmentation”. Segmentation is, in simple terms, the delineation of the lesion within an image. In supervised ML, a human operator delineates the lesion before the algorithm operates on the data. In unsupervised ML, the algorithm learns how to segment the lesion of interest by itself.
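Conceptually, the radiomic step can be reduced to a short sketch: given an image and a binary segmentation mask, predefined first-order statistics are computed from the pixels inside the mask and assembled into a feature vector. The toy image, mask, and feature list below are assumptions for illustration; real radiomics pipelines rely on dedicated packages (such as pyradiomics) and extract hundreds of features.

```python
import numpy as np
from scipy import stats

# Toy "CT slice" and a hypothetical binary segmentation mask of a lesion.
image = np.random.default_rng(1).normal(100, 20, size=(64, 64))
mask = np.zeros_like(image, dtype=bool)
mask[20:40, 25:45] = True

# Predefined first-order radiomic features computed inside the mask.
lesion = image[mask]
features = {
    "mean_intensity": float(lesion.mean()),
    "std_intensity": float(lesion.std()),
    "skewness": float(stats.skew(lesion)),
    "kurtosis": float(stats.kurtosis(lesion)),
    "area_px": int(mask.sum()),
}
print(features)  # this feature vector is what a downstream model analyzes
```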

Once the quantitative information of the lesion in the image has been extracted, it is analyzed using artificial neural networks (ANN). An ANN is a group of interconnected processing units organized in a structure similar to a neural network in the human brain. Each unit represents a neuron or “node”, and each connection a synapse or “weight” (Figure 1). ANNs are organized in “layers” (groups of nodes), and the most basic model contains three: (1) Input layer (which receives the data); (2) Hidden layer (which performs the calculations and analyses); and (3) Output layer (which produces the final output)[9]. Deep neural networks contain more than one hidden layer and can therefore learn to analyze data of higher complexity; this is termed “deep learning”[10]. The different layers work in a hierarchical structure to produce the desired output.
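A bare-bones version of the architecture in Figure 1 can be written in a few lines: an input vector is multiplied by one set of weights to produce the hidden layer, and by a second set to produce the output. The layer sizes, activation functions, and input values below are arbitrary illustrative choices, not the configuration of any network discussed in this review.

```python
import numpy as np

rng = np.random.default_rng(0)
n_input, n_hidden, n_output = 4, 5, 1   # e.g., 4 image features in, 1 score out

W1 = rng.normal(size=(n_input, n_hidden))    # weights: input layer -> hidden layer
W2 = rng.normal(size=(n_hidden, n_output))   # weights: hidden layer -> output layer

def forward(x):
    h = np.tanh(x @ W1)                      # hidden layer: weighted sums + nonlinearity
    return 1 / (1 + np.exp(-(h @ W2)))       # output layer: sigmoid gives a 0-1 score

x = np.array([0.2, -1.3, 0.7, 0.05])         # one hypothetical feature vector
print("network output:", forward(x))
# A "deep" network simply stacks additional hidden layers between W1 and W2.
```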

Figure 1
Figure 1 Basic anatomy of an artificial neural network. Input layer, hidden layer (there may be more than one) and output layer. All nodes are interconnected through weights (arrows).
AI-ASSISTED ANALYSIS OF ENDOSCOPIC ULTRASOUND IMAGES

Endoscopic ultrasound (EUS) is currently one of the most useful imaging modalities in the diagnosis of PDAC. The overall sensitivity of EUS-guided biopsies for the diagnosis of PDAC reaches 98%, but their specificity can be as low as 20%[11]. The accuracy of EUS depends on operator-related as well as patient- and lesion-related factors. The most important operator-related factor is the amount of experience performing EUS. On occasion, variations in gastroduodenal anatomy (e.g., after partial gastric resection or in the presence of a duodenal stricture) can significantly limit the ability of the operator to visualize the pancreas. On the other hand, patient- and lesion-related factors include the patient’s body habitus, the presence of acute inflammation (when EUS is performed immediately after an episode of acute pancreatitis), and the presence of chronic pancreatitis (CP), particularly with parenchymal calcifications. The sensitivity of EUS for PDAC in the presence of CP can be as low as 54%[12,13]. In addition, CP and autoimmune pancreatitis can occasionally form pseudotumors, which may complicate image analysis by the endosonographer.

Several studies have reported on the application of AI in the analysis of EUS images of pancreatic diseases[14-22] (Table 1). For the most part, these studies have focused on evaluating the accuracy of ANNs in differentiating CP from PDAC. Norton et al[14] used an ANN to analyze still EUS images that had been preselected by experts who did not perform the procedure and were blinded to the final diagnosis. A total of 21 patients with PDAC and 14 with CP were included. The ANN analyzed four features in each image and achieved an overall accuracy of 89%.

Table 1 Studies exploring artificial intelligence in the diagnosis of pancreatic ductal adenocarcinoma.
Ref. | Study design | Data source | AI instrument | Patients (n) | Aim | Accuracy
Norton et al[14], 2001 | Retrospective | Standard EUS | ANN | 21 | PDAC vs CP | 89%
Ozkan et al[15], 2015 | Retrospective | Standard EUS | ANN | 332 | PDAC vs normal | 89%-92%
Zhang et al[16], 2010 | Retrospective | Standard EUS | ANN | 216 | PDAC vs normal | 98%
Das et al[17], 2008 | Retrospective | Standard EUS | ANN | 56 | PDAC vs normal vs CP | 93%
Zhu et al[18], 2013 | Retrospective | Standard EUS | ANN | 388 | PDAC vs CP | 94%
Săftoiu et al[20], 2012 | Prospective | EUS with elastography | ANN | 258 | PDAC vs CP | 91%
Săftoiu et al[21], 2008 | Prospective | EUS with elastography | ANN | 68 | PDAC vs CP | 90%
Săftoiu et al[22], 2015 | Prospective | EUS with contrast | ANN | 167 | PDAC vs CP | 95%¹
Fu et al[24], 2018 | Retrospective | CT | ANN | 59 | Pancreatic tumor segmentation | 76%¹
Chu et al[25], 2019 | Retrospective | CT | Computer-derived forest algorithm | 380 | PDAC vs normal | 99%
Liu et al[26], 2019 | Retrospective | CT | ANN | 338 | PDAC vs normal | 76%
Chu et al[29], 2019 | Retrospective | CT | ANN | 456 | Segmentation of PDAC vs normal | 94%
Devi et al[32], 2019 | Retrospective | MRI | ANN | 168 | Normal vs abnormal pancreas | 96%
Gao et al[33], 2020 | Retrospective | MRI | ANN | 504 | Identify pancreatic disease | 77%
Liang et al[34], 2020 | Retrospective | MRI | ANN | 27 | Segmentation of pancreatic tumors | Not explicitly stated
Muhammad et al[42], 2019 | Retrospective | Clinical variables | ANN | 800114 | PDAC prediction | 85%
Klein et al[43], 2013 | Retrospective | Clinical variables | Computer-derived model | 7003 | PDAC risk | 61%
Hsieh et al[45], 2018 | Retrospective | Clinical variables | ANN | > 1 million | New-onset diabetes predicting PDAC | 72%
Zhao et al[46], 2011 | Retrospective | Clinical variables + PubMed data | Bayesian network inference | N/A | PDAC prediction | 85%
Sanoob et al[47], 2016 | Retrospective | Clinical variables | ANN | 120 | PDAC detection | Not explicitly stated
Momeni-Boroujeni et al[56], 2017 | Retrospective | FNA samples | ANN | 75 | PDAC diagnosis | 77%
Bhasin et al[58], 2016 | Retrospective | PDAC genes | Computer vector model | 52 | PDAC detection | 92%
Almeida et al[59], 2020 | Retrospective | PDAC genes | ANN | 402 | PDAC detection | 86%

In a similar study, Das et al[17] retrospectively analyzed the performance of an ANN in differentiating PDAC from normal pancreas and CP. A total of 56 patients (22 normal, 12 with CP and 22 with PDAC) were studied. Their AI algorithm identified PDAC with an area under the curve of 0.93. The difference in accuracy relative to the earlier study may have been secondary to the more stringent criteria used to define CP and the higher number of image features analyzed by the ANN. A larger study by Zhu et al[18] analyzed 262 patients with PDAC and 126 with CP and reported that their algorithm reached an overall accuracy of 94%.
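The performance figures quoted throughout these studies (accuracy, sensitivity, specificity, and area under the curve) are all derived from a model’s scores on a labeled test set. The sketch below shows how they are typically computed with scikit-learn; the labels and scores are invented solely for illustration.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix

# Invented test-set labels (1 = PDAC, 0 = CP) and model scores.
y_true = np.array([1, 1, 1, 0, 0, 0, 1, 0, 1, 0])
y_score = np.array([0.9, 0.8, 0.4, 0.3, 0.2, 0.6, 0.7, 0.1, 0.85, 0.35])

print("area under the ROC curve:", roc_auc_score(y_true, y_score))

# Sensitivity and specificity require choosing a decision threshold.
y_pred = (y_score >= 0.5).astype(int)
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print("sensitivity:", tp / (tp + fn))
print("specificity:", tn / (tn + fp))
```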

Elastography is an imaging method developed to assess differences in tissue consistency (“strain”) between normal and abnormal tissue during EUS. These differences are portrayed in a color-coded overlay on the EUS image, with red corresponding to softer tissue and blue to harder tissue[19]. A multicenter study reported an overall accuracy, sensitivity and specificity of 84%, 88% and 83%, respectively, in differentiating PDAC from CP[20]. This study used ANN analysis of histograms from elastography images previously selected by experts blinded to the patients’ diagnoses. A similar small study of 68 patients reported an accuracy of 90%[21].

Contrast agents have also been developed to aid in the differentiation between PDAC and CP. Săftoiu et al[22] reported that ANN analysis of contrast-enhanced EUS images could differentiate the two conditions with an area under the curve of 0.94. Few studies have focused on using standard B-mode EUS images to diagnose pancreatic tumors in the absence of CP; these have reported accuracies of up to 99%[15,16].

AI-ASSISTED ANALYSIS OF COMPUTERIZED TOMOGRAPHY IMAGES

CT is perhaps the most common medical imaging modality being explored with AI. The analysis of CT images of neoplastic lesions involves three main steps: detection, characterization and monitoring of change over time[23]. Most of the available studies have focused on AI-assisted characterization of lesions, which is equivalent to the previously defined concept of segmentation. Three studies have applied AI to the analysis of CT images in PDAC for diagnostic purposes[24-26]. In a small study of 15 healthy patients and 44 with a variety of pancreatic tumors, Fu et al[24] reported that their algorithm achieved an overall sensitivity of 76%. In a retrospective case-control study of 380 patients (190 cases and 190 controls), Chu et al[25] reported that their computer-derived algorithm achieved an accuracy of 99%. Meanwhile, a prospective study by Liu et al[26] reported an accuracy of 76%. AI has also been utilized to establish correlations between the CT features of PDAC and its subsequent biological behavior[27,28].
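For orientation, the sketch below defines a small convolutional network that classifies CT patches as tumor versus normal, written with tf.keras. It is only a toy stand-in under stated assumptions: the patch size, layer configuration, and randomly generated placeholder data are illustrative and far simpler than the architectures used in the studies cited above.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

# Small convolutional network for 64x64 single-channel CT patches.
model = tf.keras.Sequential([
    layers.Conv2D(16, 3, activation="relu", input_shape=(64, 64, 1)),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(32, activation="relu"),
    layers.Dense(1, activation="sigmoid"),   # probability that the patch contains tumor
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=[tf.keras.metrics.AUC()])

# Hypothetical placeholder data; real work would load curated, labeled patches.
X = np.random.rand(100, 64, 64, 1).astype("float32")
y = np.random.randint(0, 2, size=100)
model.fit(X, y, epochs=2, batch_size=16, verbose=0)
```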

Two important ongoing projects are worth mentioning. The Felix Project, funded by the Lustgarten Foundation, is a multidisciplinary study carried out by a group at Johns Hopkins University. Using deep learning models with manually segmented images from 156 PDAC cases and 300 normal controls, the group reported a sensitivity and specificity of 94% and 99%, respectively, in its initial report[29]. Based on these encouraging results, the group’s next step is to expand the analysis to 575 normal and 750 PDAC patients. The ultimate goal is to fine-tune the performance of the algorithm before applying it to larger, externally derived datasets.

The second ongoing study is being conducted by the Alliance of Pancreatic Cancer Consortium Imaging Working Group[2]. Its objective is to collect pre- and post-diagnosis CT, MRI and transabdominal ultrasound images from patients ultimately diagnosed with PDAC. These images will be used to create a repository that will later be shared and analyzed. The ultimate goal of the project is to develop AI models that can predict the appearance of PDAC and diagnose the disease in its early stages.

AI-ASSISTED ANALYSIS OF MAGNETIC RESONANCE IMAGES

Segmentation of MR images by AI has been reported to be more technically challenging than segmentation of CT images[30,31]. A few studies have reported that ML models can be trained to accurately identify pathology in the pancreas, although the literature on PDAC is scarce. Devi et al[32] reported that an ANN could identify a variety of abnormal pancreatic findings with 96% accuracy. Similarly, Gao et al[33] reported that an AI model differentiated normal from abnormal pancreas at a level comparable to humans (77% vs 82%, respectively). In the only published study aimed specifically at identifying PDAC in MR images, Liang et al[34] reported that a convolutional neural network (a type of ANN) performed similarly to humans in identifying the lesion.

AI processing of MR images has also been applied in the context of PDAC therapy. Spieler et al[35] reported that an ANN accurately delineated pancreatic tumors prior to radiotherapy, and Zhao et al[36] reported similarly positive results. In a study that used AI to automatically calculate the dose of stereotactic body radiation therapy, Campbell et al[37] demonstrated that an ANN-calculated dose was comparable to a human-calculated one. By applying radiomics to MR images, investigators have reported that the resulting quantitative data can be correlated with tumor subtype, survival and response to chemotherapy[38,39]. The same technology has made possible other studies showing that MR image data can be used to predict relapse after PDAC treatment[40,41].

AI ANALYSIS OF CLINICAL DATABASES

The evolution of AI has produced models with the capacity to analyze data beyond quantitative image features. This type of ML requires more complex neural network architectures, given the broad range of variables analyzed at any given time. These models have been applied in the development of algorithms that can accurately identify patients with, or at risk of developing, PDAC based on several clinical variables.

Muhammad et al[42] utilized an ANN to analyze a large patient population derived from the National Health Interview Survey and the Prostate, Lung, Colorectal and Ovarian trial. The authors developed and trained the ANN with > 800000 patients, of whom 898 had PDAC. Analyzing variables such as demographics, comorbidities, race and family history, the model predicted the development of PDAC with an area under the curve (AUC) of 0.85. In a similar manner, albeit with lower accuracy, Klein et al[43] utilized data from the PanScan Consortium to develop a model that predicted high risk of PDAC among patients of European ancestry with an AUC of 0.61.

New-onset diabetes has been adopted as a marker for patients at high risk of developing PDAC within the following 3 years[44]. As such, it has been the subject of analysis by AI techniques. Hsieh et al[45] compared the accuracy of standard logistic regression and an ANN in predicting PDAC among patients with new-onset diabetes. A total of 3092 PDAC cases were identified from a population of > 1000000 patients. Interestingly, the logistic regression slightly outperformed the ANN, with an area under the receiver operating characteristic curve (AUROC) of 0.70 vs 0.64, respectively. In a complex study combining PubMed data and clinical information, Zhao et al[46] utilized an innovative weighted Bayesian network that accurately predicted PDAC with an AUROC of 0.91. In a similar but simpler study, Sanoob et al[47] reported that an ANN could accurately diagnose PDAC based on a combination of signs and symptoms. ANNs that analyze clinical data have also been used to determine patient survival and performance after PDAC treatment[48,49].
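The type of comparison reported by Hsieh et al[45] can be sketched in spirit as follows: a plain logistic regression and a small neural network are trained on the same tabular variables and compared by AUROC. The synthetic, class-imbalanced dataset and model settings are assumptions chosen only to mimic the rarity of PDAC in a screening population.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score

# Synthetic tabular "clinical" data with rare positives, mimicking a screening setting.
X, y = make_classification(n_samples=5000, n_features=12, weights=[0.97],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

lr = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
nn = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000,
                   random_state=0).fit(X_tr, y_tr)

for name, model in [("logistic regression", lr), ("neural network", nn)]:
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: AUROC = {auc:.2f}")
```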

AI-ASSISTED ANALYSIS OF PATHOLOGICAL AND MOLECULAR FEATURES OF PDAC

Currently, pathologists are responsible for interpretation of histology specimens, and this process is dependent on their previous training, level of experience and individual skills. In an attempt to standardize interpretation and reduce human bias, AI techniques have been applied to pathology specimen analysis[50].

Application of AI in pathology is dependent on the creation of a high-resolution digital image from the glass slide. This step is called “whole slide imaging” (WSI). WSI is accomplished by the use of glass slide scanners and the technology to support them[51]. WSI and the necessary IT infrastructure to support its clinical use are referred to as “digital pathology”. It is expected that eventually the pathology workflow will move away from pathologists looking at glass slides through a microscope to pathologists reviewing digital images of slides on high-resolution computer screens. Food and Drug Administration clearance for these devices is relatively new (2017), and widespread adoption of digital pathology technology is still in its early phases. There is an increasing amount of data on the validation of WSI compared with microscope viewing of glass slides. For example, a recent study showed good concordance of frozen-section interpretation between the glass slide/microscope and the WSI/digital pathology workstation[52]. Similarly excellent intraobserver concordance between glass slides and digital pathology has been shown for routine clinical workloads in surgical pathology[53].
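Before any AI analysis, a gigapixel WSI must be broken into pieces a network can ingest. The sketch below tiles a region of a hypothetical slide file with the open-source openslide-python library; the file name, tile size, region limit, and pyramid level are illustrative assumptions, and real pipelines add tissue detection, filtering of blank tiles, and color normalization.

```python
import openslide

# Open a hypothetical scanned slide and cut a region of it into fixed-size tiles.
slide = openslide.OpenSlide("slide.svs")
tile_size = 256
width, height = slide.dimensions              # full-resolution (level 0) dimensions

tiles = []
for x in range(0, min(width, 2048), tile_size):       # limit the sweep for this sketch
    for y in range(0, min(height, 2048), tile_size):
        # read_region returns an RGBA PIL image at the requested pyramid level
        tile = slide.read_region((x, y), 0, (tile_size, tile_size)).convert("RGB")
        tiles.append(tile)                    # in practice: discard blank tiles, then classify

print(f"extracted {len(tiles)} tiles from a {width}x{height} slide")
```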

There is already significant work, and there are commercially available devices, that can bring AI computing power to the interpretation and screening of biopsies, although commercial expansion in this area is in its infancy[54]. Much of the work on AI in pathology concerns prostate and breast malignancy, since the incidence of these malignancies is fairly high, and the clinical need and potential commercial applications present a more attractive corporate opportunity for device sales. However, with continued digitization of glass slides, particularly of pancreatic malignancy, both fine needle aspiration (FNA) specimens and surgical pathology, it is hoped that an enlarging curated group of cases can serve as a training set for AI analysis of pancreatic cancer.

There have been only a limited number of investigations of AI in pathology for the diagnosis of pancreatic cancer. Although the sensitivity and specificity of EUS-guided FNA samples are on average > 90%[55], some specimens still fall under the “atypical cells” category, posing a considerable diagnostic dilemma. In a recent study, Momeni-Boroujeni et al[56] studied the performance of an ANN in reclassifying EUS-FNA specimens originally labeled as “atypical” by pathologists. Among a group of 31 patients in whom the final diagnosis had been previously established by other diagnostic methods, the ANN’s overall accuracy for adequately reclassifying the specimen as malignant or benign was 77%.

Two of the main factors driving the high mortality of PDAC are suboptimal understanding of its malignant behavior and its unpredictable treatment response rate. Advances in AI-assisted genetic and molecular profiling of PDAC have recently broadened insight into these factors[57]. Recent data suggest that early diagnosis could be possible through AI analysis of the transcription products of certain PDAC genes, with reported sensitivities and specificities ranging from 88% to 95% and 83% to 95%, respectively[58,59]. AI has also been utilized to match PDAC biological information with the chemical properties of specific drugs in order to develop models capable of predicting response to those agents[60,61].

FUTURE CONSIDERATIONS

Technological advances in the last 50 years have exponentially increased the amount and quality of data available for medical decision making. Hence, it is becoming increasingly evident that new methods for its storage and analysis are necessary. Although the concept of AI or its applications in medicine may still seem foreign to most practitioners, it is rapidly positioning itself as an indispensable tool to reduce human error. As sophisticated and elegant as our current diagnostic and therapeutic capabilities may be, they remain inevitably limited by our conscious and subconscious biases, as well as by the wide range of our intellectual and technical skills. William Osler’s famous quote, “medicine is a science of uncertainty and an art of probability”, will likely never be proven false. However, the degree of uncertainty and probability considered tolerable in modern medicine is constantly shrinking. The advent of AI brings, in theory, the promise of reducing or even eliminating these shortcomings.

Nevertheless, this promise is one that needs to be taken cautiously. There are several hurdles that must be overcome before AI can see widespread adoption in medical care. One limitation is the current lack of adequate standardization. Uniform protocols for data collection, processing, storage, reproduction and analysis must be established. Furthermore, different types of data may require different AI technologies. For example, an ANN trained to classify histologic slides of pancreatic biopsies fixed and stained with a specific method may underperform, or not perform at all, when presented with slides prepared in a different manner. Creating such universal protocols, although possible, will be laborious and expensive.

Another concern is the ethical handling of information. AI systems require vast amounts of data, and therefore their implementation demands reliable methods of patient data de-identification. This is indispensable to ensure patient confidentiality, since one of the pillars of AI is data sharing. On the other hand, de-identified data need to remain traceable so that individual practitioners can retrieve them and make the necessary decisions at the bedside. Three different models have been developed for data sharing in AI, and all have their advantages and disadvantages[2,62-65].

Centralized models require the sharing of large amounts of data by different sources (i.e., institutions). These data are uploaded to a central server that carries out the algorithmic adjustments. Once trained, the central server shares the finalized algorithm with the individual sources for internal use. The main drawback of this model is that centralization of the information in the server may increase the risk of a security breach, as the individual source no longer controls the information (Figure 2). In distributed or federated models, each source develops and adjusts its own algorithm with internal data. Once each source has fine-tuned its algorithm, it shares the algorithm’s parameters with a central server. The central server then utilizes all the individual parameters to develop a centralized algorithm that is later returned to each source for internal use (Figure 3). The main advantage of this model is that raw data are not shared with the central server. Hybrid models combine features of both. A data repository is created to be shared by both the data providers and the central server. The data repository develops an algorithm using each individual institution’s data before it is sent to the central server. The central server then updates the master algorithm before returning it to the data repository (Figure 4).
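As a concrete illustration of the federated idea, the sketch below has three hypothetical institutions each train a simple model on their own private data and share only the fitted parameters, which a central server averages into a master model. The data, model choice, and number of sites are assumptions; production federated systems iterate this exchange over many rounds and use dedicated frameworks.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def local_training(seed):
    """One institution: fit a model on private data and return only its parameters."""
    rng = np.random.default_rng(seed)
    X = rng.normal(size=(200, 5))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, 200) > 0).astype(int)
    model = LogisticRegression().fit(X, y)
    return model.coef_.copy(), model.intercept_.copy()

# Three institutions train locally; their raw data never leave each site.
params = [local_training(seed) for seed in (1, 2, 3)]

# The central server averages the shared parameters into a master model.
master_coef = np.mean([coef for coef, _ in params], axis=0)
master_intercept = np.mean([intercept for _, intercept in params], axis=0)
print("master coefficients:", master_coef.round(2))
print("master intercept:", master_intercept.round(2))
```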

Figure 2
Figure 2 Centralized artificial intelligence information sharing system. Each individual institution provides data to the central server. The server analyzes all the data and develops an algorithm that is sent to each institution. This algorithm is then used by each institution to analyze its own internal data in the future.
Figure 3
Figure 3 Federated artificial intelligence information sharing system. Each individual institution develops its own algorithm with internal data. Once the algorithms are developed, their parameters are shared with the central server. The server then develops a master algorithm using all the individual parameters. The master algorithm is sent back to the institutions for internal use.
Figure 4
Figure 4 Hybrid artificial intelligence information sharing system. A data repository is created as an intermediary between the institutions and the central server. The data repository develops one or more algorithms with the data. The parameters of these algorithms are then shared with the central server to create a master algorithm, which is then returned to the repository. New data coming from the institutions are then used by the repository to create new parameters that are sent to the central server to update the master algorithm.

The quality of data currently utilized in AI needs to be improved. Most of the AI systems used in data analysis so far have been trained and tested with rather small datasets originating from within local institutions. This raises the issue of information bias. These datasets lack the degree of diversity necessary to mirror the scenarios human providers face during routine clinical practice. For AI systems to perform adequately, the datasets need to be sufficiently diverse in all the possible variables that come into play when making clinical decisions (demographics, medical and/or family history, physical and laboratory findings and others). Therefore, datasets need to originate from a variety of sources for them to be representative and inclusive, not from a limited number of large academic medical centers or research institutions.

Another shortcoming is the fact that the average ANN functions as a “black box”[66]. As such, how a specific variable in a dataset is weighted by specific nodes in the network is currently uninterpretable. When evaluating the performance of any ANN, clinicians, mathematicians and computer scientists need to understand the “reasoning” that occurs within the hidden layers. Although ANNs can be interrogated through mathematical reasoning, this does not reflect clinical decision making. Understanding the way in which ANNs analyze information is paramount for improving their performance and correcting errors that could lead to fatal consequences.
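One widely used, if partial, way of peering into the black box is permutation importance: each input variable is shuffled in turn, and the resulting drop in performance indicates how heavily the network relies on it. The sketch below applies scikit-learn’s implementation to a toy neural network; the synthetic data and model settings are illustrative assumptions, and the method reveals variable reliance rather than the reasoning inside the hidden layers.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.inspection import permutation_importance

# Synthetic data: only a few of the input variables are actually informative.
X, y = make_classification(n_samples=1000, n_features=8, n_informative=3,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

nn = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000,
                   random_state=0).fit(X_tr, y_tr)

# Shuffle each variable in turn and measure how much test performance drops.
result = permutation_importance(nn, X_te, y_te, n_repeats=10, random_state=0)
for i, importance in enumerate(result.importances_mean):
    print(f"feature {i}: importance = {importance:.3f}")
```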

Finally, much of the promise of applying AI to medical image analysis depends on the ability to obtain actionable results rapidly. The current need for multiple intricate post-processing steps prior to analysis indicates that much more work must be done before this technology can provide “on-the-fly” results[63].

Because of the limitations enumerated above, it is clear that major improvements to the technology need to occur before AI can support everyday activities in clinical practice. It is evident, however, that medicine has reached a “point of no return” regarding the application of AI. Overcoming these hurdles will require collaboration between academic centers, industry, computer scientists, venture capitalists, regulatory organizations and governments (Figure 5). The main goals of this collaboration should be streamlining the development of standard data platforms, lowering technology costs, and making the technology more “user friendly” so that any provider can use it in real time.

Figure 5
Figure 5 Collaboration to expedite broad artificial intelligence application in medicine. AI: Artificial intelligence.
CONCLUSION

With further research, AI could have a large impact on the diagnosis and treatment of PDAC in the future. Novel screening methods are needed, and AI analysis of large, comprehensive clinical datasets may yield opportunities for early detection or even prediction of PDAC before a visible lesion can be seen on imaging. An AI protocol that prescreens CT or MRI studies before a radiologist reads them could ensure that lesions are not missed due to human error. “On-the-fly” AI assistance with endoscopic ultrasound imaging could help the endosonographer optimally target a needle biopsy of a mass. The pathologist could be assisted by an AI algorithm that analyzes the biopsy slide as it is being read. AI could also prove very useful for following response to treatment and even for suggesting optimal, personalized treatment strategies based on biological profiling. While AI applications in PDAC are still at a very early stage of development, further investment in research could lead to substantial improvements in screening, early diagnosis, and treatment.

Footnotes

Manuscript source: Invited manuscript

Corresponding Author's Membership in Professional Societies: American Society for Gastrointestinal Endoscopy, No. 136083; American College of Gastroenterology, No. 35849.

Specialty type: Gastroenterology and hepatology

Country/Territory of origin: United States

Peer-review report’s scientific quality classification

Grade A (Excellent): 0

Grade B (Very good): B

Grade C (Good): C, C

Grade D (Fair): 0

Grade E (Poor): 0

P-Reviewer: Caputo D, Gkolfakis P, Kosuga T S-Editor: Zhang L L-Editor: A P-Editor: Ma YJ

References
1.  Arnold M, Rutherford MJ, Bardot A, Ferlay J, Andersson TM, Myklebust TÅ, Tervonen H, Thursfield V, Ransom D, Shack L, Woods RR, Turner D, Leonfellner S, Ryan S, Saint-Jacques N, De P, McClure C, Ramanakumar AV, Stuart-Panko H, Engholm G, Walsh PM, Jackson C, Vernon S, Morgan E, Gavin A, Morrison DS, Huws DW, Porter G, Butler J, Bryant H, Currow DC, Hiom S, Parkin DM, Sasieni P, Lambert PC, Møller B, Soerjomataram I, Bray F. Progress in cancer survival, mortality, and incidence in seven high-income countries 1995-2014 (ICBP SURVMARK-2): a population-based study. Lancet Oncol. 2019;20:1493-1505.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 599]  [Cited by in F6Publishing: 558]  [Article Influence: 111.6]  [Reference Citation Analysis (0)]
2.  Young MR, Abrams N, Ghosh S, Rinaudo JAS, Marquez G, Srivastava S. Prediagnostic Image Data, Artificial Intelligence, and Pancreatic Cancer: A Tell-Tale Sign to Early Detection. Pancreas. 2020;49:882-886.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 10]  [Cited by in F6Publishing: 9]  [Article Influence: 2.3]  [Reference Citation Analysis (0)]
3.  Bray F, Ferlay J, Soerjomataram I, Siegel RL, Torre LA, Jemal A. Global cancer statistics 2018: GLOBOCAN estimates of incidence and mortality worldwide for 36 cancers in 185 countries. CA Cancer J Clin. 2018;68:394-424.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 53206]  [Cited by in F6Publishing: 50898]  [Article Influence: 8483.0]  [Reference Citation Analysis (44)]
4.  Tang A, Tam R, Cadrin-Chênevert A, Guest W, Chong J, Barfett J, Chepelev L, Cairns R, Mitchell JR, Cicero MD, Poudrette MG, Jaremko JL, Reinhold C, Gallix B, Gray B, Geis R;  Canadian Association of Radiologists (CAR) Artificial Intelligence Working Group. Canadian Association of Radiologists White Paper on Artificial Intelligence in Radiology. Can Assoc Radiol J. 2018;69:120-135.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 223]  [Cited by in F6Publishing: 238]  [Article Influence: 39.7]  [Reference Citation Analysis (0)]
5.  Nwanganga F, Chapple M.   What Is Machine Learning? 2020.  [PubMed]  [DOI]  [Cited in This Article: ]
6.  Shalev-Shwartz S, Ben-David S.   Understanding machine learning: From theory to algorithms: Cambridge university press, 2014 [cited 10 March 2021]. Available from: http://www.cs.huji.ac.il/~shais/UnderstandingMachineLearning/copy.html.  [PubMed]  [DOI]  [Cited in This Article: ]
7.  Le Berre C, Sandborn WJ, Aridhi S, Devignes MD, Fournier L, Smaïl-Tabbone M, Danese S, Peyrin-Biroulet L. Application of Artificial Intelligence to Gastroenterology and Hepatology. Gastroenterology 2020; 158: 76-94. e2.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 230]  [Cited by in F6Publishing: 259]  [Article Influence: 64.8]  [Reference Citation Analysis (0)]
8.  Gillies RJ, Kinahan PE, Hricak H. Radiomics: Images Are More than Pictures, They Are Data. Radiology. 2016;278:563-577.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 4541]  [Cited by in F6Publishing: 4480]  [Article Influence: 560.0]  [Reference Citation Analysis (2)]
9.  Shahid N, Rappon T, Berta W. Applications of artificial neural networks in health care organizational decision-making: A scoping review. PLoS One. 2019;14:e0212356.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 186]  [Cited by in F6Publishing: 116]  [Article Influence: 23.2]  [Reference Citation Analysis (0)]
10.  LeCun Y, Bengio Y, Hinton G. Deep learning. Nature. 2015;521:436-444.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 36149]  [Cited by in F6Publishing: 17282]  [Article Influence: 1920.2]  [Reference Citation Analysis (0)]
11.  Kitano M, Yoshida T, Itonaga M, Tamura T, Hatamaru K, Yamashita Y. Impact of endoscopic ultrasonography on diagnosis of pancreatic cancer. J Gastroenterol. 2019;54:19-32.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 170]  [Cited by in F6Publishing: 174]  [Article Influence: 34.8]  [Reference Citation Analysis (0)]
12.  Krishna NB, Mehra M, Reddy AV, Agarwal B. EUS/EUS-FNA for suspected pancreatic cancer: influence of chronic pancreatitis and clinical presentation with or without obstructive jaundice on performance characteristics. Gastrointest Endosc. 2009;70:70-79.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 65]  [Cited by in F6Publishing: 73]  [Article Influence: 4.9]  [Reference Citation Analysis (0)]
13.  Fritscher-Ravens A, Brand L, Knöfel WT, Bobrowski C, Topalidis T, Thonke F, de Werth A, Soehendra N. Comparison of endoscopic ultrasound-guided fine needle aspiration for focal pancreatic lesions in patients with normal parenchyma and chronic pancreatitis. Am J Gastroenterol. 2002;97:2768-2775.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 196]  [Cited by in F6Publishing: 210]  [Article Influence: 9.5]  [Reference Citation Analysis (0)]
14.  Norton ID, Zheng Y, Wiersema MS, Greenleaf J, Clain JE, Dimagno EP. Neural network analysis of EUS images to differentiate between pancreatic malignancy and pancreatitis. Gastrointest Endosc. 2001;54:625-629.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 71]  [Cited by in F6Publishing: 69]  [Article Influence: 3.0]  [Reference Citation Analysis (0)]
15.  Ozkan M, Cakiroglu M, Kocaman O, Kurt M, Yilmaz B, Can G, Korkmaz U, Dandil E, Eksi Z. Age-based computer-aided diagnosis approach for pancreatic cancer on endoscopic ultrasound images. Endosc Ultrasound. 2016;5:101-107.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 36]  [Cited by in F6Publishing: 54]  [Article Influence: 6.8]  [Reference Citation Analysis (0)]
16.  Zhang MM, Yang H, Jin ZD, Yu JG, Cai ZY, Li ZS. Differential diagnosis of pancreatic cancer from normal tissue with digital imaging processing and pattern recognition based on a support vector machine of EUS images. Gastrointest Endosc. 2010;72:978-985.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 54]  [Cited by in F6Publishing: 52]  [Article Influence: 3.7]  [Reference Citation Analysis (0)]
17.  Das A, Nguyen CC, Li F, Li B. Digital image analysis of EUS images accurately differentiates pancreatic cancer from chronic pancreatitis and normal tissue. Gastrointest Endosc. 2008;67:861-867.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 69]  [Cited by in F6Publishing: 72]  [Article Influence: 4.5]  [Reference Citation Analysis (0)]
18.  Zhu M, Xu C, Yu J, Wu Y, Li C, Zhang M, Jin Z, Li Z. Differentiation of pancreatic cancer and chronic pancreatitis using computer-aided diagnosis of endoscopic ultrasound (EUS) images: a diagnostic test. PLoS One. 2013;8:e63820.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 54]  [Cited by in F6Publishing: 65]  [Article Influence: 5.9]  [Reference Citation Analysis (2)]
19.  Giovannini M, Hookey LC, Bories E, Pesenti C, Monges G, Delpero JR. Endoscopic ultrasound elastography: the first step towards virtual biopsy? Endoscopy. 2006;38:344-348.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 215]  [Cited by in F6Publishing: 202]  [Article Influence: 11.2]  [Reference Citation Analysis (0)]
20.  Săftoiu A, Vilmann P, Gorunescu F, Janssen J, Hocke M, Larsen M, Iglesias-Garcia J, Arcidiacono P, Will U, Giovannini M, Dietrich CF, Havre R, Gheorghe C, McKay C, Gheonea DI, Ciurea T;  European EUS Elastography Multicentric Study Group. Efficacy of an artificial neural network-based approach to endoscopic ultrasound elastography in diagnosis of focal pancreatic masses. Clin Gastroenterol Hepatol 2012; 10: 84-90. e1.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 123]  [Cited by in F6Publishing: 116]  [Article Influence: 9.7]  [Reference Citation Analysis (0)]
21.  Săftoiu A, Vilmann P, Gorunescu F, Gheonea DI, Gorunescu M, Ciurea T, Popescu GL, Iordache A, Hassan H, Iordache S. Neural network analysis of dynamic sequences of EUS elastography used for the differential diagnosis of chronic pancreatitis and pancreatic cancer. Gastrointest Endosc. 2008;68:1086-1094.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 169]  [Cited by in F6Publishing: 185]  [Article Influence: 11.6]  [Reference Citation Analysis (0)]
22.  Săftoiu A, Vilmann P, Dietrich CF, Iglesias-Garcia J, Hocke M, Seicean A, Ignee A, Hassan H, Streba CT, Ioncică AM, Gheonea DI, Ciurea T. Quantitative contrast-enhanced harmonic EUS in differential diagnosis of focal pancreatic masses (with videos). Gastrointest Endosc. 2015;82:59-69.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 94]  [Cited by in F6Publishing: 94]  [Article Influence: 10.4]  [Reference Citation Analysis (0)]
23.  Hosny A, Parmar C, Quackenbush J, Schwartz LH, Aerts HJWL. Artificial intelligence in radiology. Nat Rev Cancer. 2018;18:500-510.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 1552]  [Cited by in F6Publishing: 1365]  [Article Influence: 227.5]  [Reference Citation Analysis (2)]
24.  Fu M, Wu W, Hong X, Liu Q, Jiang J, Ou Y, Zhao Y, Gong X. Hierarchical combinatorial deep learning architecture for pancreas segmentation of medical computed tomography cancer images. BMC Syst Biol. 2018;12:56.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 49]  [Cited by in F6Publishing: 33]  [Article Influence: 5.5]  [Reference Citation Analysis (0)]
25.  Chu LC, Park S, Kawamoto S, Fouladi DF, Shayesteh S, Zinreich ES, Graves JS, Horton KM, Hruban RH, Yuille AL, Kinzler KW, Vogelstein B, Fishman EK. Utility of CT Radiomics Features in Differentiation of Pancreatic Ductal Adenocarcinoma From Normal Pancreatic Tissue. AJR Am J Roentgenol. 2019;213:349-357.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 73]  [Cited by in F6Publishing: 91]  [Article Influence: 18.2]  [Reference Citation Analysis (0)]
26.  Liu SL, Li S, Guo YT, Zhou YP, Zhang ZD, Lu Y. Establishment and application of an artificial intelligence diagnosis system for pancreatic cancer with a faster region-based convolutional neural network. Chin Med J (Engl). 2019;132:2795-2803.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 54]  [Cited by in F6Publishing: 43]  [Article Influence: 8.6]  [Reference Citation Analysis (0)]
27.  Koay EJ, Lee Y, Cristini V, Lowengrub JS, Kang Y, Lucas FAS, Hobbs BP, Ye R, Elganainy D, Almahariq M, Amer AM, Chatterjee D, Yan H, Park PC, Rios Perez MV, Li D, Garg N, Reiss KA, Yu S, Chauhan A, Zaid M, Nikzad N, Wolff RA, Javle M, Varadhachary GR, Shroff RT, Das P, Lee JE, Ferrari M, Maitra A, Taniguchi CM, Kim MP, Crane CH, Katz MH, Wang H, Bhosale P, Tamm EP, Fleming JB. A Visually Apparent and Quantifiable CT Imaging Feature Identifies Biophysical Subtypes of Pancreatic Ductal Adenocarcinoma. Clin Cancer Res. 2018;24:5883-5894.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 56]  [Cited by in F6Publishing: 62]  [Article Influence: 10.3]  [Reference Citation Analysis (0)]
28.  Qiu W, Duan N, Chen X, Ren S, Zhang Y, Wang Z, Chen R. Pancreatic Ductal Adenocarcinoma: Machine Learning-Based Quantitative Computed Tomography Texture Analysis For Prediction Of Histopathological Grade. Cancer Manag Res. 2019;11:9253-9264.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 16]  [Cited by in F6Publishing: 17]  [Article Influence: 3.4]  [Reference Citation Analysis (0)]
29.  Chu LC, Park S, Kawamoto S, Wang Y, Zhou Y, Shen W, Zhu Z, Xia Y, Xie L, Liu F, Yu Q, Fouladi DF, Shayesteh S, Zinreich E, Graves JS, Horton KM, Yuille AL, Hruban RH, Kinzler KW, Vogelstein B, Fishman EK. Application of Deep Learning to Pancreatic Cancer Detection: Lessons Learned From Our Initial Experience. J Am Coll Radiol. 2019;16:1338-1342.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 41]  [Cited by in F6Publishing: 42]  [Article Influence: 10.5]  [Reference Citation Analysis (0)]
30.  Bobo MF, Bao S, Huo Y, Yao Y, Virostko J, Plassard AJ, Lyu I, Assad A, Abramson RG, Hilmes MA, Landman BA. Fully Convolutional Neural Networks Improve Abdominal Organ Segmentation. Proc SPIE Int Soc Opt Eng. 2018;10574.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 18]  [Cited by in F6Publishing: 22]  [Article Influence: 3.7]  [Reference Citation Analysis (0)]
31.  Shen J, Baum T, Cordes C, Ott B, Skurk T, Kooijman H, Rummeny EJ, Hauner H, Menze BH, Karampinos DC. Automatic segmentation of abdominal organs and adipose tissue compartments in water-fat MRI: Application to weight-loss in obesity. Eur J Radiol. 2016;85:1613-1621.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 23]  [Cited by in F6Publishing: 25]  [Article Influence: 3.1]  [Reference Citation Analysis (0)]
32.  Devi BA, Rajasekaran MP.   Performance evaluation of MRI pancreas image classification using artificial neural network (ANN). Smart Intelligent Computing and Applications: Springer, 2019: 671-681.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 3]  [Cited by in F6Publishing: 3]  [Article Influence: 0.6]  [Reference Citation Analysis (0)]
33.  Gao X, Wang X. Performance of deep learning for differentiating pancreatic diseases on contrast-enhanced magnetic resonance imaging: A preliminary study. Diagn Interv Imaging. 2020;101:91-100.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 15]  [Cited by in F6Publishing: 15]  [Article Influence: 3.0]  [Reference Citation Analysis (0)]
34.  Liang Y, Schott D, Zhang Y, Wang Z, Nasief H, Paulson E, Hall W, Knechtges P, Erickson B, Li XA. Auto-segmentation of pancreatic tumor in multi-parametric MRI using deep convolutional neural networks. Radiother Oncol. 2020;145:193-200.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 27]  [Cited by in F6Publishing: 48]  [Article Influence: 12.0]  [Reference Citation Analysis (0)]
35.  Spieler B, Patel N, Breto A, Ford J, Stoyanova R, Zavala-Romero O, Mellon E, Portelance L. Automatic segmentation of abdominal anatomy by artificial intelligence (AI) in adaptive radiotherapy of pancreatic cancer. INT J Radiat Oncol. 2019;105:E130-E131.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 2]  [Cited by in F6Publishing: 3]  [Article Influence: 0.6]  [Reference Citation Analysis (0)]
36.  Zhao W, Shen L, Han B, Yang Y, Cheng K, Toesca DAS, Koong AC, Chang DT, Xing L. Markerless Pancreatic Tumor Target Localization Enabled By Deep Learning. Int J Radiat Oncol Biol Phys. 2019;105:432-439.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 23]  [Cited by in F6Publishing: 29]  [Article Influence: 5.8]  [Reference Citation Analysis (0)]
37.  Campbell WG, Miften M, Olsen L, Stumpf P, Schefter T, Goodman KA, Jones BL. Neural network dose models for knowledge-based planning in pancreatic SBRT. Med Phys. 2017;44:6148-6158.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 37]  [Cited by in F6Publishing: 43]  [Article Influence: 6.1]  [Reference Citation Analysis (0)]
38.  Kaissis G, Ziegelmayer S, Lohöfer F, Steiger K, Algül H, Muckenhuber A, Yen HY, Rummeny E, Friess H, Schmid R, Weichert W, Siveke JT, Braren R. A machine learning algorithm predicts molecular subtypes in pancreatic ductal adenocarcinoma with differential response to gemcitabine-based vs FOLFIRINOX chemotherapy. PLoS One. 2019;14:e0218642.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 43]  [Cited by in F6Publishing: 39]  [Article Influence: 7.8]  [Reference Citation Analysis (0)]
39.  Kaissis G, Ziegelmayer S, Lohöfer F, Algül H, Eiber M, Weichert W, Schmid R, Friess H, Rummeny E, Ankerst D, Siveke J, Braren R. A machine learning model for the prediction of survival and tumor subtype in pancreatic ductal adenocarcinoma from preoperative diffusion-weighted imaging. Eur Radiol Exp. 2019;3:41.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 42]  [Cited by in F6Publishing: 46]  [Article Influence: 9.2]  [Reference Citation Analysis (0)]
40.  Sala Elarre P, Oyaga-Iriarte E, Yu KH, Baudin V, Arbea Moreno L, Carranza O, Chopitea Ortega A, Ponz-Sarvise M, Mejías Sosa LD, Rotellar Sastre F, Larrea Leoz B, Iragorri Barberena Y, Subtil Iñigo JC, Benito Boíllos A, Pardo F, Rodríguez Rodríguez J. Use of Machine-Learning Algorithms in Intensified Preoperative Therapy of Pancreatic Cancer to Predict Individual Risk of Relapse. Cancers (Basel). 2019;11.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 13]  [Cited by in F6Publishing: 13]  [Article Influence: 2.6]  [Reference Citation Analysis (0)]
41.  Tang TY, Li X, Zhang Q, Guo CX, Zhang XZ, Lao MY, Shen YN, Xiao WB, Ying SH, Sun K, Yu RS, Gao SL, Que RS, Chen W, Huang DB, Pang PP, Bai XL, Liang TB. Development of a Novel Multiparametric MRI Radiomic Nomogram for Preoperative Evaluation of Early Recurrence in Resectable Pancreatic Cancer. J Magn Reson Imaging. 2020;52:231-245.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 36]  [Cited by in F6Publishing: 41]  [Article Influence: 8.2]  [Reference Citation Analysis (0)]
42.  Muhammad W, Hart GR, Nartowt B, Farrell JJ, Johung K, Liang Y, Deng J.   Pancreatic cancer prediction through an artificial neural network. Frontiers in Artificial Intelligence 2019; 2: 2.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 91]  [Cited by in F6Publishing: 54]  [Article Influence: 10.8]  [Reference Citation Analysis (0)]
43.  Klein AP, Lindström S, Mendelsohn JB, Steplowski E, Arslan AA, Bueno-de-Mesquita HB, Fuchs CS, Gallinger S, Gross M, Helzlsouer K, Holly EA, Jacobs EJ, Lacroix A, Li D, Mandelson MT, Olson SH, Petersen GM, Risch HA, Stolzenberg-Solomon RZ, Zheng W, Amundadottir L, Albanes D, Allen NE, Bamlet WR, Boutron-Ruault MC, Buring JE, Bracci PM, Canzian F, Clipp S, Cotterchio M, Duell EJ, Elena J, Gaziano JM, Giovannucci EL, Goggins M, Hallmans G, Hassan M, Hutchinson A, Hunter DJ, Kooperberg C, Kurtz RC, Liu S, Overvad K, Palli D, Patel AV, Rabe KG, Shu XO, Slimani N, Tobias GS, Trichopoulos D, Van Den Eeden SK, Vineis P, Virtamo J, Wactawski-Wende J, Wolpin BM, Yu H, Yu K, Zeleniuch-Jacquotte A, Chanock SJ, Hoover RN, Hartge P, Kraft P. An absolute risk model to identify individuals at elevated risk for pancreatic cancer in the general population. PLoS One. 2013;8:e72311.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 87]  [Cited by in F6Publishing: 94]  [Article Influence: 8.5]  [Reference Citation Analysis (0)]
44.  Sharma A, Kandlakunta H, Nagpal SJS, Feng Z, Hoos W, Petersen GM, Chari ST. Model to Determine Risk of Pancreatic Cancer in Patients With New-Onset Diabetes. Gastroenterology 2018; 155: 730-739. e3.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 153]  [Cited by in F6Publishing: 176]  [Article Influence: 29.3]  [Reference Citation Analysis (0)]
45.  Hsieh MH, Sun LM, Lin CL, Hsieh MJ, Hsu CY, Kao CH. Development of a prediction model for pancreatic cancer in patients with type 2 diabetes using logistic regression and artificial neural network models. Cancer Manag Res. 2018;10:6317-6324.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 20]  [Cited by in F6Publishing: 28]  [Article Influence: 4.7]  [Reference Citation Analysis (0)]
46.  Zhao D, Weng C. Combining PubMed knowledge and EHR data to develop a weighted bayesian network for pancreatic cancer prediction. J Biomed Inform. 2011;44:859-868.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 80]  [Cited by in F6Publishing: 66]  [Article Influence: 5.1]  [Reference Citation Analysis (0)]
47.  Sanoob M, Madhu A, Ajesh K, Varghese SM. Artificial neural network for diagnosis of pancreatic cancer. I J C I. 2016;5:40-49.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 6]  [Cited by in F6Publishing: 6]  [Article Influence: 0.8]  [Reference Citation Analysis (0)]
48.  Walczak S, Velanovich V. An Evaluation of Artificial Neural Networks in Predicting Pancreatic Cancer Survival. J Gastrointest Surg. 2017;21:1606-1612.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 23]  [Cited by in F6Publishing: 25]  [Article Influence: 3.6]  [Reference Citation Analysis (0)]
49.  Hayward J, Alvarez SA, Ruiz C, Sullivan M, Tseng J, Whalen G. Machine learning of clinical performance in a pancreatic cancer database. Artif Intell Med. 2010;49:187-195.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 24]  [Cited by in F6Publishing: 23]  [Article Influence: 1.6]  [Reference Citation Analysis (0)]
50.  Cheng JY, Abel JT, Balis UGJ, McClintock DS, Pantanowitz L. Challenges in the Development, Deployment, and Regulation of Artificial Intelligence in Anatomic Pathology. Am J Pathol. 2020;.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 14]  [Cited by in F6Publishing: 32]  [Article Influence: 8.0]  [Reference Citation Analysis (0)]
51.  Hanna MG, Parwani A, Sirintrapun SJ. Whole Slide Imaging: Technology and Applications. Adv Anat Pathol. 2020;27:251-259.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 18]  [Cited by in F6Publishing: 45]  [Article Influence: 11.3]  [Reference Citation Analysis (0)]
52.  Cima L, Brunelli M, Parwani A, Girolami I, Ciangherotti A, Riva G, Novelli L, Vanzo F, Sorio A, Cirielli V, Barbareschi M, D'Errico A, Scarpa A, Bovo C, Fraggetta F, Pantanowitz L, Eccher A. Validation of Remote Digital Frozen Sections for Cancer and Transplant Intraoperative Services. J Pathol Inform. 2018;9:34.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 28]  [Cited by in F6Publishing: 28]  [Article Influence: 4.7]  [Reference Citation Analysis (0)]
53.  Hanna MG, Reuter VE, Ardon O, Kim D, Sirintrapun SJ, Schüffler PJ, Busam KJ, Sauter JL, Brogi E, Tan LK, Xu B, Bale T, Agaram NP, Tang LH, Ellenson LH, Philip J, Corsale L, Stamelos E, Friedlander MA, Ntiamoah P, Labasin M, England C, Klimstra DS, Hameed M. Validation of a digital pathology system including remote review during the COVID-19 pandemic. Mod Pathol. 2020;33:2115-2127.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 73]  [Cited by in F6Publishing: 85]  [Article Influence: 21.3]  [Reference Citation Analysis (0)]
54.  Diehl DL. Artificial intelligence applications in endoscopic ultrasound: “The journey of a thousand miles begins with a single step”. Gastrointest Endosc. .  [PubMed]  [DOI]  [Cited in This Article: ]
55.  Yoshida T, Yamashita Y, Kitano M. Endoscopic Ultrasound for Early Diagnosis of Pancreatic Cancer. Diagnostics (Basel). 2019;9.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 14]  [Cited by in F6Publishing: 15]  [Article Influence: 3.0]  [Reference Citation Analysis (0)]
56.  Momeni-Boroujeni A, Yousefi E, Somma J. Computer-assisted cytologic diagnosis in pancreatic FNA: An application of neural networks to image analysis. Cancer Cytopathol. 2017;125:926-933.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 20]  [Cited by in F6Publishing: 32]  [Article Influence: 4.6]  [Reference Citation Analysis (0)]
57.  Sinkala M, Mulder N, Martin D. Machine Learning and Network Analyses Reveal Disease Subtypes of Pancreatic Cancer and their Molecular Characteristics. Sci Rep. 2020;10:1212.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 34]  [Cited by in F6Publishing: 40]  [Article Influence: 10.0]  [Reference Citation Analysis (0)]
58.  Bhasin MK, Ndebele K, Bucur O, Yee EU, Otu HH, Plati J, Bullock A, Gu X, Castan E, Zhang P, Najarian R, Muraru MS, Miksad R, Khosravi-Far R, Libermann TA. Meta-analysis of transcriptome data identifies a novel 5-gene pancreatic adenocarcinoma classifier. Oncotarget. 2016;7:23263-23281.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 32]  [Cited by in F6Publishing: 33]  [Article Influence: 4.1]  [Reference Citation Analysis (0)]
59.  Almeida PP, Cardoso CP, de Freitas LM. PDAC-ANN: an artificial neural network to predict pancreatic ductal adenocarcinoma based on gene expression. BMC Cancer. 2020;20:82.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 28]  [Cited by in F6Publishing: 24]  [Article Influence: 6.0]  [Reference Citation Analysis (0)]
60.  Barretina J, Caponigro G, Stransky N, Venkatesan K, Margolin AA, Kim S, Wilson CJ, Lehár J, Kryukov GV, Sonkin D, Reddy A, Liu M, Murray L, Berger MF, Monahan JE, Morais P, Meltzer J, Korejwa A, Jané-Valbuena J, Mapa FA, Thibault J, Bric-Furlong E, Raman P, Shipway A, Engels IH, Cheng J, Yu GK, Yu J, Aspesi P Jr, de Silva M, Jagtap K, Jones MD, Wang L, Hatton C, Palescandolo E, Gupta S, Mahan S, Sougnez C, Onofrio RC, Liefeld T, MacConaill L, Winckler W, Reich M, Li N, Mesirov JP, Gabriel SB, Getz G, Ardlie K, Chan V, Myer VE, Weber BL, Porter J, Warmuth M, Finan P, Harris JL, Meyerson M, Golub TR, Morrissey MP, Sellers WR, Schlegel R, Garraway LA. The Cancer Cell Line Encyclopedia enables predictive modelling of anticancer drug sensitivity. Nature. 2012;483:603-607.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 5947]  [Cited by in F6Publishing: 5249]  [Article Influence: 437.4]  [Reference Citation Analysis (0)]
61.  Menden MP, Iorio F, Garnett M, McDermott U, Benes CH, Ballester PJ, Saez-Rodriguez J. Machine learning prediction of cancer cell sensitivity to drugs based on genomic and chemical properties. PLoS One. 2013;8:e61318.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 321]  [Cited by in F6Publishing: 271]  [Article Influence: 24.6]  [Reference Citation Analysis (0)]
62.  Chang K, Balachandar N, Lam C, Yi D, Brown J, Beers A, Rosen B, Rubin DL, Kalpathy-Cramer J. Distributed deep learning networks among institutions for medical imaging. J Am Med Inform Assoc. 2018;25:945-954.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 145]  [Cited by in F6Publishing: 134]  [Article Influence: 26.8]  [Reference Citation Analysis (0)]
63.  Sheller MJ, Reina GA, Edwards B, Martin J, Bakas S. Multi-Institutional Deep Learning Modeling Without Sharing Patient Data: A Feasibility Study on Brain Tumor Segmentation. Brainlesion. 2019;11383:92-104.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 91]  [Cited by in F6Publishing: 74]  [Article Influence: 14.8]  [Reference Citation Analysis (0)]
64.  Guinney J, Saez-Rodriguez J. Alternative models for sharing confidential biomedical data. Nat Biotechnol. 2018;36:391-392.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 43]  [Cited by in F6Publishing: 40]  [Article Influence: 6.7]  [Reference Citation Analysis (0)]
65.  Deist TM, Dankers FJWM, Ojha P, Scott Marshall M, Janssen T, Faivre-Finn C, Masciocchi C, Valentini V, Wang J, Chen J, Zhang Z, Spezi E, Button M, Jan Nuyttens J, Vernhout R, van Soest J, Jochems A, Monshouwer R, Bussink J, Price G, Lambin P, Dekker A. Distributed learning on 20 000+ lung cancer patients - The Personal Health Train. Radiother Oncol. 2020;144:189-200.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 59]  [Cited by in F6Publishing: 63]  [Article Influence: 15.8]  [Reference Citation Analysis (0)]
66.  Ford RA, Price W, Nicholson I. Privacy and accountability in black-box medicine. Mich Telecomm & Tech L Rev. 2016;23:1.  [PubMed]  [DOI]  [Cited in This Article: ]