Cited by in F6Publishing
For: Ganni S, Botden SMBI, Chmarra M, Goossens RHM, Jakimowicz JJ. A software-based tool for video motion tracking in the surgical skills assessment landscape. Surg Endosc 2018;32:2994-9. [PMID: 29340824 DOI: 10.1007/s00464-018-6023-5] [Cited by in Crossref: 15] [Cited by in F6Publishing: 15] [Article Influence: 3.8] [Reference Citation Analysis]
Citing Articles
1 Stenmark M, Omerbašić E, Magnusson M, Andersson V, Abrahamsson M, Tran PK. Vision-based Tracking of Surgical Motion during Live Open-Heart Surgery. J Surg Res 2021;271:106-16. [PMID: 34879315 DOI: 10.1016/j.jss.2021.10.025] [Cited by in Crossref: 1] [Cited by in F6Publishing: 1] [Article Influence: 1.0] [Reference Citation Analysis]
2 Levin M, McKechnie T, Kruse CC, Aldrich K, Grantcharov TP, Langerman A. Surgical data recording in the operating room: a systematic review of modalities and metrics. Br J Surg 2021;108:613-21. [PMID: 34157080 DOI: 10.1093/bjs/znab016] [Cited by in F6Publishing: 4] [Reference Citation Analysis]
3 Hunt R, Cable J, Ellison A. Shining a light on parasite behaviour: daily patterns of Argulus fish lice. Parasitology 2021;148:850-6. [PMID: 33691819 DOI: 10.1017/S0031182021000445] [Cited by in F6Publishing: 1] [Reference Citation Analysis]
4 Davids J, Makariou SG, Ashrafian H, Darzi A, Marcus HJ, Giannarou S. Automated Vision-Based Microsurgical Skill Analysis in Neurosurgery Using Deep Learning: Development and Preclinical Validation. World Neurosurg 2021;149:e669-86. [PMID: 33588081 DOI: 10.1016/j.wneu.2021.01.117] [Cited by in Crossref: 1] [Cited by in F6Publishing: 5] [Article Influence: 1.0] [Reference Citation Analysis]
5 Beulens AJW, Brinkman WM, Umari P, Koldewijn EL, Hendrikx AJM, van Basten JP, van Merriënboer JJG, van der Poel HG, Bangma C, Wagner C. Identifying the relationship between postoperative urinary continence and residual urethra stump measurements in robot assisted radical prostatectomy patients. Int J Med Robot 2021;17:e2196. [PMID: 33113236 DOI: 10.1002/rcs.2196] [Cited by in F6Publishing: 1] [Reference Citation Analysis]
6 Logishetty K, Gofton WT, Rudran B, Beaulé PE, Cobb JP. Fully Immersive Virtual Reality for Total Hip Arthroplasty: Objective Measurement of Skills and Transfer of Visuospatial Performance After a Competency-Based Simulation Curriculum. J Bone Joint Surg Am 2020;102:e27. [PMID: 31929324 DOI: 10.2106/JBJS.19.00629] [Cited by in Crossref: 12] [Cited by in F6Publishing: 17] [Article Influence: 6.0] [Reference Citation Analysis]
7 Zhenzhu L, Lu L, Zhenzhi L, Xuzhi L, Yizhi L, Gangxian F, Henglu W, Jinke D, Qingbo W, Pengfei L, Meng L, Jianmin L, Zefu L. Feasibility Study of the Low-Cost Motion Tracking System for Assessing Endoscope Holding Skills. World Neurosurg 2020;140:312-9. [DOI: 10.1016/j.wneu.2020.04.191] [Cited by in Crossref: 1] [Cited by in F6Publishing: 1] [Article Influence: 0.5] [Reference Citation Analysis]
8 Close MF, Mehta CH, Liu Y, Isaac MJ, Costello MS, Kulbarsh KD, Meyer TA. Subjective vs Computerized Assessment of Surgeon Skill Level During Mastoidectomy. Otolaryngol Head Neck Surg 2020;163:1255-7. [PMID: 32600121 DOI: 10.1177/0194599820933882] [Cited by in F6Publishing: 1] [Reference Citation Analysis]
9 Beulens AJW, Namba HF, Brinkman WM, Meijer RP, Koldewijn EL, Hendrikx AJM, van Basten JP, van Merriënboer JJG, van der Poel HG, Bangma C, Wagner C. Analysis of the video motion tracking system "Kinovea" to assess surgical movements during robot-assisted radical prostatectomy. Int J Med Robot 2020;16:e2090. [PMID: 32034977 DOI: 10.1002/rcs.2090] [Cited by in F6Publishing: 4] [Reference Citation Analysis]
10 Ganni S, Botden SMBI, Chmarra M, Li M, Goossens RHM, Jakimowicz JJ. Validation of Motion Tracking Software for Evaluation of Surgical Performance in Laparoscopic Cholecystectomy. J Med Syst 2020;44:56. [PMID: 31980955 DOI: 10.1007/s10916-020-1525-9] [Cited by in F6Publishing: 4] [Reference Citation Analysis]
11 Andras I, Mazzone E, van Leeuwen FWB, De Naeyer G, van Oosterom MN, Beato S, Buckle T, O'Sullivan S, van Leeuwen PJ, Beulens A, Crisan N, D'Hondt F, Schatteman P, van der Poel H, Dell'Oglio P, Mottrie A. Artificial intelligence and robotics: a combination that is changing the operating room. World J Urol 2020;38:2359-66. [PMID: 31776737 DOI: 10.1007/s00345-019-03037-6] [Cited by in Crossref: 9] [Cited by in F6Publishing: 19] [Article Influence: 3.0] [Reference Citation Analysis]
12 Derathé A, Reche F, Moreau-Gaudry A, Jannin P, Gibaud B, Voros S. Predicting the quality of surgical exposure using spatial and procedural features from laparoscopic videos. Int J Comput Assist Radiol Surg 2020;15:59-67. [PMID: 31673963 DOI: 10.1007/s11548-019-02072-3] [Cited by in Crossref: 4] [Cited by in F6Publishing: 3] [Article Influence: 1.3] [Reference Citation Analysis]
13 Dakua SP, Abinahed J, Zakaria A, Balakrishnan S, Younes G, Navkar N, Al-Ansari A, Zhai X, Bensaali F, Amira A. Moving object tracking in clinical scenarios: application to cardiac surgery and cerebral aneurysm clipping. Int J Comput Assist Radiol Surg 2019;14:2165-76. [PMID: 31309385 DOI: 10.1007/s11548-019-02030-z] [Cited by in Crossref: 3] [Cited by in F6Publishing: 6] [Article Influence: 1.0] [Reference Citation Analysis]
14 Kowalewski K, Garrow CR, Schmidt MW, Benner L, Müller-Stich BP, Nickel F. Sensor-based machine learning for workflow detection and as key to detect expert level in laparoscopic suturing and knot-tying. Surg Endosc 2019;33:3732-40. [DOI: 10.1007/s00464-019-06667-4] [Cited by in Crossref: 10] [Cited by in F6Publishing: 16] [Article Influence: 3.3] [Reference Citation Analysis]
15 Baghdadi A, Hussein AA, Ahmed Y, Cavuoto LA, Guru KA. A computer vision technique for automated assessment of surgical performance using surgeons' console-feed videos. Int J Comput Assist Radiol Surg 2019;14:697-707. [DOI: 10.1007/s11548-018-1881-9] [Cited by in Crossref: 9] [Cited by in F6Publishing: 13] [Article Influence: 2.3] [Reference Citation Analysis]