Table 1 Ex vivo studies
Ref. | Country | Algorithm | Number of images/videos | Outcomes
Karkanis et al[19], 2003 | Greece | Hand-crafted | 60 videos | Sensitivity 94%, specificity 99%
Maroulis et al[34], 2003 | Greece | Hand-crafted | 2809 video frames | Accuracy > 95%
Jerebko et al[35], 2006 | United States | Hand-crafted | 56 images | Sensitivity 84%
Hwang et al[21], 2007 | United States | Hand-crafted | 8621 video frames | Per-polyp sensitivity 96%
Park et al[36], 2012 | United States | Hand-crafted | 35 videos, > 1 million frames | AUROC 0.89
Wang et al[37], 2014 | United States | Hand-crafted | 46 video files | Sensitivity 81.4%
Bernal et al[38], 2015 | Spain | Hand-crafted | 612 video frames | PPV 70%
Tajbakhsh et al[39], 2015 | United States | Hand-crafted | 19400 video frames (proprietary database), 300 video frames in CVC-ColonDB | Sensitivity 48% on proprietary database, 88% on CVC-ColonDB
Wang et al[20], 2015 | United States | Hand-crafted | 53 videos | Per-polyp sensitivity 97.7%
Geetha et al[40], 2016 | India | Hand-crafted | 703 still-image frames | Sensitivity 95%, specificity 97%
Fernández-Esparrach et al[22], 2016 | Spain | Hand-crafted | 25 videos | Sensitivity 70.4%, specificity 72.4%
Angermann et al[41], 2017 | France | Hand-crafted | 18 videos with 10924 frames | Per-polyp sensitivity 100%, PPV 50%
Park et al[42], 2016 | United States | CNN | 562 images | Sensitivity 86%, specificity 85%
Billah et al[43], 2017 | Bangladesh | CNN | 14000 still images | Sensitivity 99%, specificity 99%
Yu et al[44], 2017 | China | CNN | 18 videos | Sensitivity 71%, PPV 88%
Zhang et al[23], 2017 | China | CNN | 150 random + 30 NBI images | Sensitivity 98%, PPV 99%, AUROC 1.0, accuracy 86%
Urban et al[25], 2018 | United States | CNN | 1.2 million ImageNet images, 53588 images from videos | Sensitivity 90%
Misawa et al[24], 2018 | Japan | CNN | 135 video clips | Per-polyp sensitivity 94%, per-frame sensitivity 90%, specificity 63.3%, accuracy 76.5%
Pogorelov et al[45], 2018 | Norway | CNN | 1359 to 11954 frames from still images | Sensitivity 75%, specificity 94%
Yamada et al[46], 2018 | Japan | CNN | 4840 video images | Sensitivity 97%, specificity 99%, AUROC 0.975
Zhu et al[47], 2018 | China | CNN | 616 still images | Sensitivity 89%, classification accuracy 92%
Hassan et al[26], 2019 | Italy | CNN | 338 videos, 1.5 million frames | Per-lesion sensitivity 99.7%
Ahmad et al[48], 2019 | England | CNN | 24596 video frames | Sensitivity 85%, specificity 93%
Eelbode et al[49], 2019 | Belgium | CNN | 758 still-image frames | Sensitivity 92%, specificity 85%
Ka-Luen Lui et al[50], 2019 | China | CNN | 6 unedited videos | Per-polyp sensitivity 100%, per-frame sensitivity 98.3%, specificity 99.7%, AUROC 0.99
Misawa et al[51], 2019 | Japan | CNN | 64 videos | Sensitivity 86%
Shichijo et al[52], 2019 | Japan | CNN | 1233 still images | Per-polyp sensitivity 100%, per-image sensitivity 99%, PPV 76%
Ozawa et al[53], 2020 | Japan | CNN | 7077 images | Sensitivity 92%, PPV 86%, accuracy 83%
CNN: Convolutional neural network; PPV: Positive predictive value; AUROC: Area under the receiver operating characteristic curve; NBI: Narrow-band imaging.
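The outcome measures in Table 1 are not computed identically across studies, since the unit of analysis may be the video frame, the still image, or the polyp (hence "per-frame", "per-image" and "per-polyp" variants). The sketch below gives the conventional confusion-matrix definitions these metrics are generally assumed to follow; the grouping of detections into true/false positives and negatives is study-specific.

```latex
% Conventional confusion-matrix definitions assumed for Table 1.
% TP, FP, TN, FN are true/false positives/negatives counted per frame,
% per image, or per lesion, depending on the study (requires amsmath).
\begin{align*}
  \text{Sensitivity} &= \frac{TP}{TP + FN} \\
  \text{Specificity} &= \frac{TN}{TN + FP} \\
  \text{PPV}         &= \frac{TP}{TP + FP} \\
  \text{Accuracy}    &= \frac{TP + TN}{TP + TN + FP + FN}
\end{align*}
```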
Table 2 Artificial intelligence systems and countries of approval
Artificial intelligence system | Country
GI-Genius (Medtronic) | European Union, Australia, Israel, Saudi Arabia
CAD-Eye (Fuji) | European Union
Discovery (Pentax) | European Union
Endobrain-EYE (Olympus) | Japan
Wision-AI | China
Table 3 Characteristics of in vivo randomized controlled trials
Ref. | Country | CAD system | CAD system aim | No. of patients (WL) | No. of patients (CAD) | ADR (%) (WL) | ADR (%) (CAD)
Wang et al[15], 2019 | China | EndoScreener | Detection | 536 | 522 | 20.3 | 28.9
Wang et al[54], 2020 | China | EndoScreener | Detection | 478 | 484 | 28 | 34.1
Gong et al[30], 2020 | China | ENDOANGEL | Quality | 318 | 324 | 8 | 16
Repici et al[31], 2020 | Italy | GI-Genius | Detection | 344 | 341 | 40.4 | 54.8
Liu et al[28], 2020 | China | Henan Xuanweitang Medical Information Technology Co. Ltd. | Detection | 518 | 508 | 23.9 | 39.2
Su et al[29], 2020 | China | - | Detection; quality | 315 | 308 | 16.5 | 28.9
WL: White-light (standard) colonoscopy arm; CAD: Computer-aided detection arm; ADR: Adenoma detection rate.
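For reference, the ADR reported in Table 3 is conventionally computed separately for each trial arm as the proportion of patients in that arm with at least one adenoma detected; a minimal formulation of this standard definition is sketched below.

```latex
% Conventional per-arm definition of the adenoma detection rate (ADR),
% applied separately to the white-light (WL) and CAD arms of each trial.
\[
  \mathrm{ADR} =
    \frac{\text{no. of patients with at least one adenoma detected}}
         {\text{no. of patients examined in the arm}} \times 100\%
\]
```

On this definition, the 54.8% ADR in the CAD arm of Repici et al[31], for example, corresponds to roughly 187 of the 341 patients in that arm having at least one adenoma detected.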