Systematic Reviews
Copyright ©2014 Baishideng Publishing Group Inc. All rights reserved.
World J Gastrointest Endosc. Sep 16, 2014; 6(9): 436-447
Published online Sep 16, 2014. doi: 10.4253/wjge.v6.i9.436
Evaluation of surgical training in the era of simulation
Shazrinizam Shaharan, Paul Neary
Shazrinizam Shaharan, National Surgical Training Centre, Department of Surgical Affairs, Royal College of Surgeons Ireland, Dublin 2, Ireland
Paul Neary, Division Of Colorectal Surgery, Adelaide and Meath incorporating the National Children’s Hospital, Trinity College Dublin, Tallaght, Dublin 24, Ireland
Author contributions: Shaharan S performed the literature search and analysis and wrote the manuscript; Neary P was involved in the analysis and in editing the manuscript.
Correspondence to: Shazrinizam Shaharan, MB, BCh, BAO, BA, National Surgical Training Centre, Department of Surgical Affairs, Royal College of Surgeons Ireland, 121 St Stephen’s Green, Dublin 2, Ireland.
Telephone: +353-1-4022704 Fax: +353-1-4022459
Received: April 6, 2014
Revised: April 30, 2014
Accepted: August 27, 2014
Published online: September 16, 2014


AIM: To assess where we currently stand in relation to simulator-based training within modern surgical training curricula.

METHODS: A systematic literature search was performed in the PubMed database using the keywords “simulation”, “skills assessment” and “surgery”. The studies retrieved were examined against the inclusion and exclusion criteria. The time period reviewed was 2000 to 2013. The methodology of skills assessment was examined.

RESULTS: Five hundred and fifteen articles focussed upon simulator-based skills assessment. Fifty-two articles were identified that dealt with technical skills assessment in general surgery. Five articles assessed open skills, 37 assessed laparoscopic skills, 4 assessed both open and laparoscopic skills, and 6 assessed endoscopic skills. Only 12 articles were found to integrate simulators into surgical training curricula. Observational assessment tools, in the form of the Objective Structured Assessment of Technical Skills (OSATS), dominated the literature.

CONCLUSION: Observational tools such as OSATS remain the top assessment instrument in surgical training especially in open technical skills. Unlike the aviation industry, simulation based assessment has only now begun to cross the threshold of incorporation into mainstream skills training. Over the next decade we expect the promise of simulator-based training to finally take flight and begin an exciting voyage of discovery for surgical trainees.

Key Words: Simulation, Surgical training, Surgery, Training, Objective Structured Assessment of Technical Skills, Observational tool, Surgical skills, Assessment, Skill assessment

Core tip: For over a decade, the nature of surgical training has teetered on the brink of a seismic change in how we can deliver the level of expertise required of a modern surgeon. It is evolving from the Halstedian apprenticeship model towards simulation-based training, similar to the aviation industry. Since 2000 there have been approximately 173 studies on the validation of simulators as assessment tools. As the technology grows, its translation into real curricular change remains unclear. This review focuses upon where we currently stand in relation to the effective integration of simulation-based skills assessment into modern surgical training curricula.


INTRODUCTION

For over a decade, the nature of surgical training has teetered on the brink of a seismic change in how we can deliver the level of surgical training required of a modern surgeon. The demands imposed by the zero-complication ethos expected by patients and emphasised by the media have challenged us as surgical educators to continually reassess our training paradigms. Traditionally, surgical training has been a largely opportunity-based learning approach built upon an apprenticeship in the operating room (OR). This Halstedian method[1] of surgical training is often exemplified as the “see one, do one, teach one” approach. This system, reliant upon opportunistic encounters, particularly of the complex case-mix variety, remains extremely time-dependent. The apprenticeship model resulted in surgical training often being prolonged in order to gain sufficient experience to reach a subjective level of operative competence. In the modern era, trainees are continually restricted in the number of hours they can legally work: as few as 48 h per week in Europe[2] or 80 h in North America[3]. These mandated reductions in working hours are intended to safeguard both patients and doctors and to decrease potential errors in the health care system. This decrease in hours, however, results in a fundamental reduction in trainees' exposure to operating time with “real” patients. As a direct consequence of these challenges, interest in skills laboratories with formal curricula, specifically designed to teach surgical skills, has increased dramatically[4].

The use of surgical simulators and inanimate bench models for training and assessment has been a centre of attention among training bodies around the world for well over a decade. Simulation for clinical skills training, assessment and clinical scenario management gives educators the freedom of focused training in a more controlled environment, without risking patients' lives. Trainees also have the chance to practice the skills required of a modern surgeon to proficiency, at their own pace. The greatest advantage of virtual reality medical simulation is the opportunity to try and fail without consequence for the patient[5]. The integration of simulation into training programmes would therefore seem the next intuitive step in the design and implementation of any modern surgical training curriculum.

In tandem with the continued development of surgical skills in training surgeons, of equal importance is our ability to assess candidates' proficiency in the very skills we have taught. Once again, the assessment of surgical skills has been largely subjective, and on this horizon surgical simulation may also provide a solution. The objective characterisation of technical skills can be difficult. Technical performance assessment ranges from basic surgical skills, such as knot tying, suturing, basic laparoscopic skills and endoscopy, to a wide spectrum of evaluations that include complex procedures such as laparoscopic cholecystectomy, vessel anastomosis and tendon repair. Assessment can be defined as making a judgement against a predefined reference[6]. As surgical educators, it is important to assess trainees on their progress in surgical skills in order to ensure that they remain safe in the stressful environment of a real operating theatre. It allows trainers to give constructive feedback based on performance and can be used for the award of certification or even credentialing. Despite its importance to surgeons, technical proficiency has historically been poorly evaluated[7]. A good assessment tool must possess reliability, validity, educational impact, acceptability and feasibility[8].

The aim of this review is to determine where we currently stand in relation to the use of simulation in surgical skills assessment within current training curricula. We focused upon the use of simulators in surgical curricula that embraced the concept of creating proficiency profiles using simulators. Technical performance assessment in laparoscopy, endoscopy and open surgical skills was included.


MATERIALS AND METHODS

This review encompassed a literature search in PubMed from January 2000 to November 2013. The keywords used to search the database were “simulation”, “skills assessment” and “surgery”. All titles and abstracts in the search results were reviewed by the authors, SS and PN. Full texts of compatible articles were examined for eligibility for inclusion, as agreed by both authors.

Inclusion criteria

Studies were included if simulators were used in laparoscopic and endoscopic skills assessment following an intervention such as skills training, courses, a surgical curriculum or a selection process. Studies using simulators to assess open technical skills, such as knot tying, suturing or a basic open procedure (for example, excision of a sebaceous cyst), were also included.

Exclusion criteria

The review was focused upon the use of simulators in the assessment of surgical skills; studies that aimed solely at validating their latest simulator were excluded. Studies were also excluded if the surgical skills were specific to subspecialties such as ophthalmology, urology, gynaecology, cardiothoracic surgery, ear, nose and throat (ENT), neurosurgery, and trauma and orthopaedics, or if they used non-validated methods or assessed non-technical skills, for example cognitive analysis and patient care simulation. Non-English articles, reviews, conference abstracts, editorials, comments, supplements and case reports were excluded.


RESULTS

The keyword search yielded 515 articles, of which 201 were eligible. Following application of our inclusion and exclusion criteria, 52 articles remained that dealt with technical skills assessment in general surgery. These selected articles were divided into 4 categories according to the skills assessed: open skills (Table 1), laparoscopic skills (Table 2), combined open and laparoscopic skills (Table 3), and endoscopic skills (Table 4). Of these, only 12 studies integrated simulators into a surgical curriculum with technical skills being assessed (Table 5). Only 1 study was found using simulators in the selection process for a surgical training programme.

Table 1 Study characteristics assessing open surgical skills (n = 5).
Ref. | Year | No. of trainees | Tasks | Assessment tool
Acton et al[9] | 2010 | 157 clerkship students | Suturing | OSATS
Brydges et al[10] | 2008 | 38 trainees | One-handed knot tying | Motion analysis (ROVIMAS) and GRS
Chipman et al[11] | 2009 | 24 PGY 1 trainees | Excision of skin lesion and wound closure | OSATS
Jensen et al[12] | 2008 | 45 PGY 1-2 | Excision of skin lesion and bowel anastomosis | Video-based OSATS and FPA (wound closure aesthetic quality and anastomotic leak pressure)
Olson et al[13] | 2012 | 11 interns | Open laparotomy and bowel anastomosis | OSATS and survey
Table 2 Study characteristics of studies assessing laparoscopic skills (n = 37).
Ref. | Year | No. of participants | Tasks | Assessment tool
Aggarwal et al[14] | 2007 | 20 trainees | Laparoscopic cholecystectomy | Motion analysis and video-based GRS
Arora et al[15] | 2011 | 25 surgeons | Laparoscopic cholecystectomy | OSATS
Bennett et al[16] | 2011 | 70 students | Camera navigation | Box trainer
Botden et al[17] | 2009 | 18 students | Laparoscopic suturing | ProMIS™, FPA using 5-point Likert scale
Buzink et al[18] | 2012 | 25 trainees, 6 experts | Diagnostic laparoscopy, laparoscopic cholecystectomy and laparoscopic appendicectomy | LapMentor
Cope et al[19] | 2008 | 22 interns | 6 tasks on MIST VR | MIST VR
Crochet et al[20] | 2011 | 26 trainees | Laparoscopic cholecystectomy | VR simulator
Ganai et al[21] | 2007 | 19 students | Angled telescope navigation | VR simulator
Grantcharov et al[22] | 2009 | 37 residents | Basic laparoscopic task | MIST VR
Heinrich et al[23] | 2007 | 17 experts | 26 modules | LapMentor, LapSim, ProMIS™, Surgical SIM
Kanumuri et al[24] | 2008 | 16 students | Laparoscopic suturing and knot tying | Video-based performance assessment tool on live porcine model
Kolozsvari et al[25] | 2012 | 63 residents | FLS tasks1 | FLS scoring system
Kurashima et al[26] | 2013 | 17 residents | Laparoscopic inguinal hernia repair | GOALS-GH
Langelotz et al[27] | 2005 | 150 surgeons | Navigation, coordination, grasping, cutting and clipping | VR simulators
LeBlanc et al[28] | 2010 | 29 surgeons | Laparoscopic sigmoid colectomy | ProMIS™ simulator, OSATS and operative error
Lehmann et al[29] | 2012 | 36 surgeons | 2 LapSim tasks | LapSim
Lehmann et al[30] | 2013 | 105 surgeons | Lifting and grasping, fine dissection | LapSim
Loukas et al[31] | 2011 | 25 trainees | Adhesiolysis, bowel suturing, laparoscopic cholecystectomy | LapVR
Loukas et al[32] | 2011 | 20 trainees | Adhesiolysis, bowel suturing, laparoscopic cholecystectomy | LapVR
Loukas et al[33] | 2012 | 44 novices | Peg transfer, cutting, knot tying | LapVR and video trainer
Lucas et al[34] | 2008 | 32 students | Laparoscopic cholecystectomy | OSATS
Mansour et al[35] | 2012 | 48 trainees | Peg transfer, clipping | VR simulators
Munz et al[36] | 2007 | 20 novices | Intracorporeal knot tying | ICSAD and checklist
Munz et al[37] | 2004 | 24 novices | Cutting a shape on a glove and clipping a rubber tube | Motion analysis and error score
Palter et al[38] | 2012 | 25 residents | Laparoscopic right colectomy (live and simulator) | Video-based procedure-specific evaluation tool, modified OSATS global rating scale and LapSim
Palter et al[39] | 2013 | 20 trainees | Clipping, lifting and grasping, laparoscopic cholecystectomy (actual OR) | Video-based procedure-specific evaluation tool, modified OSATS global rating scale and LapSim
Panait et al[40] | 2011 | 42 applicants | Navigation, coordination, grasping, cutting and clipping | LapSim
Rinewalt et al[41] | 2012 | 20 residents | FLS tasks | GOALS
Rosenthal et al[42] | 2006 | 20 students | Clip and cut cystic duct | Xitact LS500 Virtual Patient
Seymour et al[43] | 2002 | 16 trainees | Laparoscopic cholecystectomy (OR) | Video-based operative error scoring system
Sharma et al[44] | 2013 | 19 trainees | Laparoscopic cholecystectomy | LAP Mentor™
Stefanidis et al[45] | 2013 | 42 novices | Laparoscopic suturing (OR) | GOALS, speed, accuracy and inadvertent injuries
Stelzer et al[46] | 2009 | 23 interns | Peg transfer and intracorporeal knot tying in dry lab; running the bowel and intracorporeal knot tying in live porcine model | MISTELS scoring system; video-based modified GOALS
Tanoue et al[47] | 2010 | 194 surgeons | Lifting and grasping | LapSim
Torkington et al[48] | 2001 | 13 trainees | MIST VR tasks | ICSAD and MIST VR
van Rijssen et al[49] | 2012 | 162 trainees | Intracorporeal knot tying | OSATS and Motion Analysis Parameter (MAP)
Varas et al[50] | 2012 | 25 residents | Laparoscopic jejunojejunostomy | OSATS, ICSAD, FPA
Table 3 Study characteristics of studies in assessment of open and laparoscopic skills (n = 4).
Ref. | Year | No. of participants | Tasks | Assessment tool
Beard et al[51] | 2011 | 85 trainees | Mixed tasks (OR) | Procedure-based assessment, OSATS
Fernandez et al[52] | 2012 | 30 PGY 1 | Knot tying, suturing, laparoscopic skills | OSATS, computer metric-based performance assessments
Mittal et al[53] | 2012 | 60 residents | Basic skills (knot tying, wound closure, enterotomy, vascular anastomosis) and FLS | OSATS and FLS
Parent et al[54] | 2010 | 28 interns | Wound closure and FLS tasks | Essential item checklist, economy of time, global competence, FLS system
Table 4 Characteristics of studies in assessment of endoscopic skills (n = 6).
Ref. | Year | No. of participants | Tasks | Assessment tool
Ende et al[55] | 2012 | 28 residents | OGD | Simulator and observation
Götzberger et al[56] | 2011 | 13 trainees | No mention in abstract | Simulator (5-point Likert scale)
Haycock et al[57] | 2010 | 36 trainees | Colonoscopy (simulator and OR) | Direct Observation of Procedural Skills and Global Scores sheet
Haycock et al[58] | 2009 | 28 trainees | Polypectomy, control of upper GI bleeding, oesophageal dilation and PEG insertion | Station-specific checklist and global score
Shirai et al[59] | 2008 | 20 residents | OGD | 11-item 5-grade scale
Van Sickle et al[60] | 2011 | 41 trainees | Colonoscopy | GI Mentor II and GAGES
Table 5 Characteristics of studies integrating skills assessment tools in simulation-based curricula and selection processes (n = 12).
Ref. | Year | No. of participants | Tasks | Assessment tool
Open skills
Chipman et al[11] | 2009 | 24 trainees | Excision of skin lesion and wound closure | OSATS
Olson et al[13] | 2012 | 11 interns | Open laparotomy, bowel anastomosis | OSATS and survey
Laparoscopic skills
Buzink et al[18] | 2012 | 25 trainees, 6 experts | Diagnostic laparoscopy, laparoscopic cholecystectomy and laparoscopic appendicectomy | LapMentor
Palter et al[39] | 2013 | 20 trainees | Clipping, lifting and grasping, laparoscopic cholecystectomy (actual OR) | Video-based procedure-specific evaluation tool, modified OSATS global rating scale and LapSim tasks
Panait et al[40] | 2011 | 42 applicants | Navigation, coordination, grasping, cutting and clipping | LapSim
Rinewalt et al[41] | 2012 | 20 residents | FLS tasks | GOALS
van Rijssen et al[49] | 2012 | 162 trainees | Intracorporeal knot tying | OSATS and Motion Analysis Parameter (MAP)
Varas et al[50] | 2012 | 25 residents | Laparoscopic jejunojejunostomy | OSATS, ICSAD, FPA
Open and laparoscopic skills
Fernandez et al[52] | 2012 | 30 PGY 1 | Knot tying, suturing, laparoscopic skills | OSATS, computer metric-based performance assessments
Mittal et al[53] | 2012 | 60 residents | Basic skills (knot tying, wound closure, enterotomy, vascular anastomosis), FLS tasks1 | OSATS and FLS score
Parent et al[54] | 2010 | 28 interns | Wound closure, FLS tasks1 | Essential item checklist, economy of time, global competence, FLS score
Endoscopic skills
Van Sickle et al[60] | 2011 | 41 trainees | Colonoscopy | GI Mentor II and GAGES

With an increasing emphasis on surgical procedures being undertaken via a minimally invasive approach, it is not surprising that the assessment of laparoscopic skills dominates the articles included. This bias also reflects the reality that laparoscopic skills assessment in a simulator has proved far easier than the assessment of open surgical skills. However, observational-type assessment tools remain the instrument of choice across all skill domains, especially when assessing trainees in a real operating theatre (OR).

Of the studies identified, 21 employed observational tools, mainly the Objective Structured Assessment of Technical Skills (OSATS), as the main scoring system to evaluate candidates' technical performance in open and laparoscopic skills.

The use of simulators in the assessment of laparoscopic skills was evident in 23 publications. Nineteen studies utilised only the objective metrics generated by the simulator, and 3 studies used the FLS scoring system. One study[17] combined the objective metrics from the simulator with error or injury scores. A total of 13 studies that assessed laparoscopic skills in simulators used OSATS or checklist-based tools alone. Of these, 2 studies[43,45] assessed trainees in the operating theatre (OR) using video-based observational tools following simulation-based training. Interestingly, one study[39] combined the performance score on the simulator with performance in the OR. Five studies[14,36,37,48,50] used ICSAD combined with other assessment tools or simulator-generated metrics in both open surgery and laparoscopy.

Table 5 outlines the reports that incorporated simulators as part of their training curricula. Two addressed open surgical skills, 6 laparoscopic skills, 3 both open and laparoscopic skills, and only 1 endoscopic skills assessment.

One study[40] used a virtual reality laparoscopic simulator to assess general surgical applicants shortlisted for residency interview. However, the scores were not used to rank the candidates for acceptance into the training programme.


DISCUSSION

Simulation in surgery has been a hot topic among surgical educators for more than a decade. In the early millennium, there was an avalanche of studies focused on validating simulators and proving their reliability and fidelity. Since the year 2000, approximately 173 studies have been published that specifically reported construct validity of a wide spectrum of surgical simulators. Many new technologies evolved to progressively improve the existing simulators into higher-fidelity systems. However, despite the plethora of validation studies completed over a decade ago, there is a glaring hiatus in the literature when one examines the integration of these simulators into surgical training curricula. In particular, there is a lack of studies demonstrating the implementation of these simulators in surgical training institutions across the globe, especially in the arena of surgical skills assessment for credentialing. From our review, only 12 studies could be identified, of the 515 triaged, that have integrated simulation into a surgical training curriculum. There were 52 studies that used simulators in surgical skills assessment within general surgery. The size of these studies was quite modest, with 34 having fewer than 40 candidates and only 5 having more than 100 candidates.

The main purpose of having simulators in the surgical training arena is the acquisition of technical skills appropriate to the level of training. This may be undertaken in an environment that is safe from both the trainees' and the patients' viewpoint. Simulation-based surgical training is important in teaching surgical trainees and monitoring their progress along the training programme until they possess the essential technical skills, without risking patients' lives. To achieve this, continuous training and assessment are paramount. Traditionally, trainees' surgical skills have been assessed by examining the logbook and supervisor feedback after a certain amount of time in service. However, it is clear that a logbook records experience and is not a marker of expertise[61]. It contains the number of procedures and a supervision code, rather than performance scores for a particular procedure; therefore, logbooks lack content validity[62]. Supervisor feedback assesses the overall performance of a particular trainee rather than technical skills exclusively. It is largely subjective and influenced by multiple factors such as the patient's condition, the theatre environment and hospital conditions. Therefore, the need for a more robust assessment tool that is objective, reliable and feasible[63] remains.

In our institution, surgical simulators are used as part of the initial selection process and thereafter for skills assessment and ongoing training. Irish surgical trainees are required to attend simulation-based operative skills classes throughout their training programme. Apart from didactic teaching, practical sessions are provided that allow trainees to practice their skills in open surgery, laparoscopy and endoscopy. Basic surgical trainees are assessed at the end of their training years; trainees who underperform are required to attend a remedial day where their performance is discussed with the faculty. For the past 6 years, all candidates shortlisted for the Higher Surgical Training (HST) programme in general surgery, cardiothoracic surgery and plastic surgery have been required to undergo surgical skills assessments prior to their interviews; their scores contribute 10% of the overall marks. Gallagher et al[64] showed that four of the five top performers on the technical skills stations during selection of higher surgical trainees in general surgery were among the top-ranked applicants overall and subsequently succeeded in being selected into the HST programme. In plastic surgery, Carroll et al[65] showed that applicants selected for HST performed better in all six tasks (laceration repair, Z-plasty, lipoma excision, sebaceous cyst excision, tendon repair and arterial anastomosis) than those who were not.

OSATS remains the assessment tool of choice in the evaluation of surgical skills. In our own training programme it is used for all open surgical procedures with inanimate bench models, such as bowel anastomosis, excision of lipoma or sebaceous cyst, and laparotomy incision and closure. Each station is assessed by an expert surgeon relevant to the specialty, and all stations are run simultaneously within a time frame. For laparoscopic skills, OSATS assessment is combined with performance on the ProMIS™ laparoscopic simulator (Haptica, Dublin, Ireland). The laparoscopic tasks generally include object positioning and sharp dissection. The ProMIS™ simulator scores the trainees or candidates according to total path length, smoothness, time and error. In general surgery and cardiothoracic skills assessment, the GI Mentor endoscopy simulator (Simbionix, Cleveland, OH, United States) and a 15-item checklist are used to assess candidates' endoscopic skills. The GI Mentor provides time taken and the percentage of mucosa visualised as objective scores in the assessment.

From this review, we identified that the main instruments utilised in practice remain observational tools for both open surgery and laparoscopy. This is despite a myriad of validated computer-based simulators being available in laparoscopy. The most commonly used observational tool is the Objective Structured Assessment of Technical Skills, or OSATS. It consists of two evaluation components: an operation-specific checklist and a global rating scale. It is consistent with the format of the typical Objective Structured Clinical Examination (OSCE), in which examinees perform a series of clinical tasks at each of several time-limited stations[66]. In another study[41], a different observer-dependent assessment tool was used for assessing laparoscopic skills: the Global Operative Assessment of Laparoscopic Skills (GOALS), developed by a group of researchers[67] in Quebec, Canada, which consists of a checklist and 2 visual analogue scales (VAS). All these observational tools require a minimum of two independent assessors in order to avoid the bias of scoring by a single assessor. Therefore, a group of expert surgeons must be recruited to use these assessment tools, either live during the assessment or from video recordings. Since multiple assessors are required for these tools to be valuable, there should be minimal discrepancy between the assessors' scores; otherwise, the scores are open to critique. The degree of agreement among assessors is expressed as inter-rater (IR) reliability, and a value of at least 0.8 is conventionally regarded as indicating a high level of agreement. A high IR reliability indicates that the scores are homogeneous and that the assessment tool is both robust and of value. In one study[13], IR reliability was 0.67, which reflects significant differences of opinion among assessors regarding the subjective data they are evaluating.
This emphasises the weakness of this scoring system, as well as its labour-intensive nature. In all these studies, candidates could feel appropriately aggrieved if the arbitrators of success in any task demonstrated such significant differences of opinion, as evidenced by so low an IR reliability score. We would contend that truly objective assessment via simulation in real time must inherently be a stronger approach to assessment.
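To illustrate what an inter-rater reliability figure expresses, a simple proxy is the correlation between two assessors' scores on the same set of candidates (formal studies typically use an intraclass correlation coefficient or Cohen's kappa). The rater scores below are hypothetical, not data from the studies reviewed.

```python
from statistics import mean

def pearson_r(x, y):
    """Pearson correlation between two raters' scores: a simple
    proxy for inter-rater reliability (values near 1.0 indicate
    close agreement; below ~0.8 the scores are open to critique)."""
    mx, my = mean(x), mean(y)
    dx = [a - mx for a in x]
    dy = [b - my for b in y]
    num = sum(a * b for a, b in zip(dx, dy))
    den = (sum(a * a for a in dx) * sum(b * b for b in dy)) ** 0.5
    return num / den

# Hypothetical OSATS global rating scores from two assessors
rater_a = [12, 18, 25, 30, 22, 15]
rater_b = [14, 17, 27, 28, 20, 16]
print(round(pearson_r(rater_a, rater_b), 2))  # high agreement
```

A correlation of, say, 0.96 would support pooling the two assessors' scores, whereas a value around 0.67, as in the study cited above, would not.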

As with every technology, there is a variety of simulators on the market that have been used in surgical skills assessment. In laparoscopic training and assessment, computer-based simulators are able to provide objective metrics after completion of a laparoscopic task. Examples of validated virtual reality (VR) simulators in laparoscopy are MIST VR, LapSim, LapMentor and Xitact LS500[68]. These simulators can assess various laparoscopic skills such as camera navigation, object positioning and manipulation, intracorporeal suturing and sharp dissection. However, the main criticism of VR simulators is their lack of real-life fidelity, such as delayed gravity effects and absent haptic feedback, as found in LapSim[36].

A hybrid simulator, ProMIS™ (Haptica, Dublin), uses full VR for certain tasks and augmented reality that overlays graphics onto a task performed on a physical model[69]. It provides the tactile feedback that is lacking in most VR simulators. VR and hybrid simulators are able to quantify skills in terms of path length, smoothness, economy of movement and time. The simulators are also able to identify procedure-specific errors and include them in the final report. Various studies have shown their validity and reliability[70-76]. However, these simulators are largely used for learning and practising skills and are rarely used as assessment tools. Only 56% of the studies in this review employed simulator-generated objective metrics in laparoscopic skills assessment, either exclusively or combined with other assessment tools.

Endoscopic skills can also be trained and assessed using simulators. Training in endoscopy in a virtual environment is thought to be a good alternative to classical bedside teaching, without its adverse effects such as patient discomfort, risk of perforation and longer examination time[77]. The GI Mentor (Simbionix, Israel) is one of the commonest endoscopy simulators used in surgical training institutions. After performing a case on the simulator, the trainee is presented with an evaluation of performance, such as time taken, percentage of mucosa visualised, and percentage of time spent without clear vision (red-out)[78]. Recently, the Society of American Gastrointestinal and Endoscopic Surgeons (SAGES) developed the Fundamentals of Endoscopic Surgery™ (FES™) as a training and assessment tool for basic skills in endoscopy[79].

There are fundamental differences between the skills required for laparoscopic surgery and those required for open surgery[80]. It is clear from the literature that the use of simulators in open surgery represents a challenge; in general, simulator development has tended to target minimally invasive surgery (MIS)[75]. Nonetheless, open procedures remain paramount across surgical specialties, and it is vital to teach surgical trainees and assess their open surgical skills during their training years. Inanimate bench models, such as the laparotomy model from Simulab Corporation (Seattle, WA) and the skin pads and saphenofemoral junction model from Limbs and Things (Bristol, United Kingdom), are amongst the most commonly used in training and assessment. Animal models, either cadaveric or live, have been used in some studies but are plagued by ethical issues regarding animal rights. In the United Kingdom, the use of live animals is not permitted under current law, unlike in the rest of Europe, the United States and other countries[81]. Martin et al[82] showed that bench-top simulations gave results equivalent to the use of live animals.

The challenge in the assessment of open surgical skills is deciding which parameters should be evaluated. The role of simulators in the assessment of open surgery may lie in the determination of a surgeon's dexterity, yet the objective measurement of a surgeon's technical skill or level of dexterity has proved very difficult. Surprisingly, only 1 study combined OSATS with a motion analysis system in an attempt to capture the essence of dexterity[10]. The technology behind the measurement of dexterity in surgery, and in particular open surgery, is however slowly evolving. Researchers at Imperial College London developed a motion tracking system called the Imperial College Surgical Assessment Device (ICSAD), a combination of a commercially available electromagnetic tracking system (Isotrak II, Polhemus Inc, Colchester, VT) and a bespoke computer software program[83]. It measures the time taken, path length and number of movements in open and laparoscopic skills assessment, and these measurements have been shown to discriminate between different levels of surgical experience in laparoscopy[48] and open surgical procedures[84]. Subsequently, the RObotics VIdeo and Motion Assessment Software (ROVIMAS) replaced the former ICSAD motion analysis software and integrated an improved data acquisition module, including real-time synchronised motion-video capture[85]. Despite these technologies now being over a decade old, they remain largely research tools rather than being incorporated into mainstream curricula.
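As a sketch of the kind of dexterity metrics that ICSAD-style tracking reports, the following derives time taken, path length and a crude movement count from time-stamped 3D hand positions. The sample format and the fixed distance threshold for counting a "movement" are assumptions for illustration; real systems segment movements from velocity profiles.

```python
import math

def motion_metrics(samples, move_threshold=0.005):
    """ICSAD-style dexterity metrics from (t, x, y, z) samples
    (seconds / metres, hypothetical format). Returns total time,
    path length, and number of discrete movements, where a movement
    is counted whenever a step exceeds move_threshold metres."""
    time_taken = samples[-1][0] - samples[0][0]
    path_length = 0.0
    movements = 0
    for (t0, *p0), (t1, *p1) in zip(samples, samples[1:]):
        step = math.dist(p0, p1)  # Euclidean distance between samples
        path_length += step
        if step > move_threshold:
            movements += 1
    return time_taken, path_length, movements

# A short hypothetical track sampled twice per second
track = [(0.0, 0.0, 0.0, 0.0), (0.5, 0.01, 0.0, 0.0),
         (1.0, 0.01, 0.02, 0.0), (1.5, 0.01, 0.02, 0.001)]
print(motion_metrics(track))
```

Lower path length and fewer movements for the same completed task are the usual markers of greater dexterity in such systems.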

The measurement of dexterity alone is insufficient without the appropriate context; in essence, dexterity may be independent of the quality of the end result. This is what surgical context represents. Errors such as a slipped knot or incorrect suture placement could cause serious morbidity for patients. It is the appreciation of these errors that underpins the concept of placing skills assessment, and the associated metrics, in the correct context.

Most methods of assessing errors or analysing the end product are observational. A crude assessment of the quality of the final product uses a 5-point scale[86]. Scott et al[87] formulated a proficiency score that incorporates a series of observed errors for knot tying and suturing skills and a maximum allowable task duration as a cutoff time. The formula used was as follows: Score = (cutoff time) - (completion time) - 10 × (sum of errors); a higher score indicates superior performance[86]. Significant weight was given to the sum of errors, underlining the importance of end-product quality in surgical skills assessment. Patel et al[88] developed low-fidelity exercises for basic skills training and assessment, and demonstrated their validity. The exercises were needle driving, knot tying, two-hand coordination and fine motor coordination; the metrics measured included time, accuracy, and the number of targets completed (needle driving) or the number of appropriate knots (knot tying). Again, this approach is open to bias and labour intensive. In practice, the quality of a knot is easily tested by spreading the loop until it either breaks or slips; however, this is rarely performed with a standardised force by surgical educators. Several studies have instead used tensiometers to assess the quality of knots[89-92]. Brydges et al[93] developed a measure of wound closure performance called “absolute symmetry error”, a measure related to the “bite size” of each suture placement; it does not require an expert assessor and is feasible for self-training and self-assessment. A few studies assessed the end product of bowel anastomosis by measuring the leak pressure[12,50]. These studies combined a validated assessment tool with final product analysis (FPA). By combining these components in the assessment of skills, the appraisal of trainees is thought to be more accurate and transparent.
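The Scott et al[87] scoring formula above can be expressed directly in code. The following is a minimal sketch; the cutoff time and error counts in the example are hypothetical illustrations, not values from the study:

```python
def proficiency_score(cutoff_time, completion_time, errors):
    """Proficiency score after Scott et al[87]: higher is better.

    cutoff_time and completion_time are in seconds; errors is a list of
    per-category error counts (e.g. slipped knots, incorrectly placed
    sutures), each error penalised at 10 points.
    """
    return cutoff_time - completion_time - 10 * sum(errors)

# Hypothetical trainee: 300 s cutoff, task completed in 180 s,
# one slipped knot and one incorrectly placed suture observed.
print(proficiency_score(300, 180, [1, 1]))  # 100
```

The 10-point penalty per error is what gives end-product quality its dominant weight: a single error costs as much as ten seconds of extra completion time.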
From our review of the literature, only 2 studies[17,28] combined virtual reality simulator-generated metrics with error scoring systems in their assessments. This approach seems sensible when considering surgical skills assessment.

There is a vast quantity of published data underpinning the validity of surgical simulators. However, it was abundantly evident from our review that only a small number of papers have outlined their use as part of a training curriculum. It should be noted that the literature search was restricted to English language publications. In total, twelve studies were identified that incorporated simulation-based training into the curriculum. The participants in these studies trained with simulators for varying periods, and their technical skills were assessed at the end of the training phase. OSATS or other observational tools were used to assess open skills in 5 studies. For laparoscopic skills, only 2 of these articles used simulators alone in the assessment, while 3 studies combined the simulator score with observational assessment tools. Only 1 study[50] assessed trainees’ performances using multi-modal assessment tools, namely OSATS, ICSAD and final product analysis (FPA) (leakage and permeability of an anastomosis). In another study[60], endoscopic skills were assessed by a combination of simulator-based scores and Global Assessment of Gastrointestinal Endoscopic Skills (GAGES) scores. Two studies developed intensive boot camp sessions for new residents in order to boost their basic technical skills at the start of their training programme[52,54]. Both studies assessed open technical skills using observational tools; for laparoscopic skills, one study[52] used computer-generated metrics and the other[54] used the FLS scoring system. Fernandez et al[52] showed that new residents’ performances improved after the 9-wk intensive course. In the other study[54], however, the boot camp ran for only 3 d and the performances did not differ significantly from those of the control group. Interestingly, only 1 study[38] assessed trainees’ performances in a simulation laboratory and thereafter in the operating room (OR).
After training to proficiency on the simulators, the trainees were required to perform laparoscopic cholecystectomy with their supervisors, and their performances were video-recorded and then assessed using observational tools. This was the only study to report active integration of simulator-based surgical skills training and its translation into real-time clinical practice.

It is clear that the assessment of surgical skills in simulation laboratories is robust. The critical question is whether the skills acquired from a simulation-based curriculum are transferable to real operations. The most recent systematic review, by Buckley et al[94], demonstrated that simulation-based training has a positive impact on operative time and predefined performance scores in the OR, but not on quantifiable measures such as ergonomics, hand dominance and smoothness of movement as measured by simulators. The fundamental assumption of simulation-based training is that the skills acquired in simulated settings are directly transferable to the operative setting[95]. If this assumption is proven true, a simulation-based curriculum must be one of the main pillars in creating top-quality surgeons, which in turn would promote excellent patient care and safety.

Over the last decade, observational assessment tools such as OSATS have remained the most widely used methodology for assessing surgical skills. It has been over a decade since motion tracking systems were reported to be effective tools for assessing surgical skills[96]. Despite advances in simulation technology, the available technology has not been fully incorporated into surgical training curricula; this is particularly true for the assessment of open skills. One must therefore ask why this is the case. The turn of the millennium brought a frenzy of validation studies of simulators, and since then the technology has only improved in fidelity and reproducibility. The dearth of information in the literature regarding the efficacy of simulators in training programmes may be related to the paucity of data on translating simulator-based training into the real patient setting. Yet the conversion from VR to OR, as coined by Professor Anthony Gallagher[97], is perhaps finally beginning to gain traction. In the past 14 years, 12 articles have reported their experience of simulators within general surgical training programmes; one of these has now translated this VR training into the OR in practice.

The integration of surgical skills assessment into the selection process for Higher Surgical Training (HST) in the Irish National Training Programme is a further example of the potential that simulation holds for the surgical training community. One can only hope that over the next decade, now that the validity of simulator-based training has finally been accepted, simulation-based surgical training will no longer stand on the precipice but will finally take flight.


Background

The traditional apprenticeship model of surgical training, as described by William Halsted, relies on clinical opportunity for trainees to gain sufficient surgical skills. In the current climate, surgical training is focusing on integrating simulators into formal curricula, including the assessment of trainees’ surgical skills.

Research frontiers

Over the past 14 years, a plethora of published studies have validated various simulators. However, the integration of these simulators into surgical skills assessment as an objective measurement remains minimal.

Innovations and breakthroughs

The authors identified that observer-dependent tools remain largely the tools of choice for assessing both laparoscopic and open technical skills. Some of the studies outlined the use of simulators in the objective assessment of laparoscopic skills, and only a minimal number of studies demonstrated the application of non-observer-dependent tools in the assessment of endoscopic and open surgical skills.


Applications

The assessment of surgical skills using simulators is highly applicable to the surgical curriculum. The next step is to apply simulation technology to the assessment of technical skills in a real operative setting.


Terminology

Observational tool: Checklist-based assessment tool used by surgical experts; OSATS: Objective Structured Assessment of Technical Skills.

Peer review

This is a very nice review of the available literature on the results of simulation training on surgical residents.


P- Reviewer: Leitman M, Soreide JA S- Editor: Ji FF L- Editor: A E- Editor: Zhang DN

1.  Cameron JL. William Stewart Halsted. Our surgical heritage. Ann Surg. 1997;225:445-458.  [PubMed]  [DOI]
2.  European Union. Employment Rights and Work Organisation.  Available from: Accessed on Nov 25, 2013.  [PubMed]  [DOI]
3.  Accreditation Council for Graduate Medical Education. Common Program Requirements.  Available from: http:// Accessed on Nov 25, 2013.  [PubMed]  [DOI]
4.  Reznick RK, MacRae H. Teaching surgical skills--changes in the wind. N Engl J Med. 2006;355:2664-2669.  [PubMed]  [DOI]
5.  Satava RM. Accomplishments and challenges of surgical simulation. Surg Endosc. 2001;15:232-241.  [PubMed]  [DOI]
6.  Beard JD. Assessment of surgical skills of trainees in the UK. Ann R Coll Surg Engl. 2008;90:282-285.  [PubMed]  [DOI]
7.  Ahlberg G, Enochsson L, Gallagher AG, Hedman L, Hogman C, McClusky DA, Ramel S, Smith CD, Arvidsson D. Proficiency-based virtual reality training significantly reduces the error rate for residents during their first 10 laparoscopic cholecystectomies. Am J Surg. 2007;193:797-804.  [PubMed]  [DOI]
8.  Schuwirth L, van der Vleuten C. Merging views on assessment. Med Educ. 2004;38:1208-1210.  [PubMed]  [DOI]
9.  Acton RD, Chipman JG, Gilkeson J, Schmitz CC. Synthesis versus imitation: evaluation of a medical student simulation curriculum via Objective Structured Assessment of Technical Skill. J Surg Educ. 2010;67:173-178.  [PubMed]  [DOI]
10.  Brydges R, Kurahashi A, Brümmer V, Satterthwaite L, Classen R, Dubrowski A. Developing criteria for proficiency-based training of surgical technical skills using simulation: changes in performances as a function of training year. J Am Coll Surg. 2008;206:205-211.  [PubMed]  [DOI]
11.  Chipman JG, Schmitz CC. Using objective structured assessment of technical skills to evaluate a basic skills simulation curriculum for first-year surgical residents. J Am Coll Surg. 2009;209:364-370.e2.  [PubMed]  [DOI]
12.  Jensen AR, Wright AS, McIntyre LK, Levy AE, Foy HM, Anastakis DJ, Pellegrini CA, Horvath KD. Laboratory-based instruction for skin closure and bowel anastomosis for surgical residents. Arch Surg. 2008;143:852-858; discussion 858.  [PubMed]  [DOI]
13.  Olson TP, Becker YT, McDonald R, Gould J. A simulation-based curriculum can be used to teach open intestinal anastomosis. J Surg Res. 2012;172:53-58.  [PubMed]  [DOI]
14.  Aggarwal R, Ward J, Balasundaram I, Sains P, Athanasiou T, Darzi A. Proving the effectiveness of virtual reality simulation for training in laparoscopic surgery. Ann Surg. 2007;246:771-779.  [PubMed]  [DOI]
15.  Arora S, Miskovic D, Hull L, Moorthy K, Aggarwal R, Johannsson H, Gautama S, Kneebone R, Sevdalis N. Self vs expert assessment of technical and non-technical skills in high fidelity simulation. Am J Surg. 2011;202:500-506.  [PubMed]  [DOI]
16.  Bennett A, Birch DW, Menzes C, Vizhul A, Karmali S. Assessment of medical student laparoscopic camera skills and the impact of formal camera training. Am J Surg. 2011;201:655-659.  [PubMed]  [DOI]
17.  Botden SM, de Hingh IH, Jakimowicz JJ. Suturing training in Augmented Reality: gaining proficiency in suturing skills faster. Surg Endosc. 2009;23:2131-2137.  [PubMed]  [DOI]
18.  Buzink S, Soltes M, Radonak J, Fingerhut A, Hanna G, Jakimowicz J. Laparoscopic Surgical Skills programme: preliminary evaluation of Grade I Level 1 courses by trainees. Wideochir Inne Tech Malo Inwazyjne. 2012;7:188-192.  [PubMed]  [DOI]
19.  Cope DH, Fenton-Lee D. Assessment of laparoscopic psychomotor skills in interns using the MIST Virtual Reality Simulator: a prerequisite for those considering surgical training? ANZ J Surg. 2008;78:291-296.  [PubMed]  [DOI]
20.  Crochet P, Aggarwal R, Dubb SS, Ziprin P, Rajaretnam N, Grantcharov T, Ericsson KA, Darzi A. Deliberate practice on a virtual reality laparoscopic simulator enhances the quality of surgical technical skills. Ann Surg. 2011;253:1216-1222.  [PubMed]  [DOI]
21.  Ganai S, Donroe JA, St Louis MR, Lewis GM, Seymour NE. Virtual-reality training improves angled telescope skills in novice laparoscopists. Am J Surg. 2007;193:260-265.  [PubMed]  [DOI]
22.  Grantcharov TP, Funch-Jensen P. Can everyone achieve proficiency with the laparoscopic technique? Learning curve patterns in technical skills acquisition. Am J Surg. 2009;197:447-449.  [PubMed]  [DOI]
23.  Heinrichs WL, Lukoff B, Youngblood P, Dev P, Shavelson R, Hasson HM, Satava RM, McDougall EM, Wetter PA. Criterion-based training with surgical simulators: proficiency of experienced surgeons. JSLS. 2007;11:273-302.  [PubMed]  [DOI]
24.  Kanumuri P, Ganai S, Wohaibi EM, Bush RW, Grow DR, Seymour NE. Virtual reality and computer-enhanced training devices equally improve laparoscopic surgical skill in novices. JSLS. 2008;12:219-226.  [PubMed]  [DOI]
25.  Kolozsvari NO, Kaneva P, Vassiliou MC, Fried GM, Feldman LS. New dog, new tricks: trends in performance on the Fundamentals of Laparoscopic Surgery simulator for incoming surgery residents. Surg Endosc. 2012;26:68-71.  [PubMed]  [DOI]
26.  Kurashima Y, Feldman LS, Kaneva PA, Fried GM, Bergman S, Demyttenaere SV, Li C, Vassiliou MC. Simulation-based training improves the operative performance of totally extraperitoneal (TEP) laparoscopic inguinal hernia repair: a prospective randomized controlled trial. Surg Endosc. 2014;28:783-788.  [PubMed]  [DOI]
27.  Langelotz C, Kilian M, Paul C, Schwenk W. LapSim virtual reality laparoscopic simulator reflects clinical experience in German surgeons. Langenbecks Arch Surg. 2005;390:534-537.  [PubMed]  [DOI]
28.  Leblanc F, Delaney CP, Neary PC, Rose J, Augestad KM, Senagore AJ, Ellis CN, Champagne BJ. Assessment of comparative skills between hand-assisted and straight laparoscopic colorectal training on an augmented reality simulator. Dis Colon Rectum. 2010;53:1323-1327.  [PubMed]  [DOI]
29.  Lehmann KS, Gröne J, Lauscher JC, Ritz JP, Holmer C, Pohlen U, Buhr HJ. [Simulation training in surgical education - application of virtual reality laparoscopic simulators in a surgical skills course]. Zentralbl Chir. 2012;137:130-137.  [PubMed]  [DOI]
30.  Lehmann KS, Holmer C, Gillen S, Gröne J, Zurbuchen U, Ritz JP, Buhr HJ. Suitability of a virtual reality simulator for laparoscopic skills assessment in a surgical training course. Int J Colorectal Dis. 2013;28:563-571.  [PubMed]  [DOI]
31.  Loukas C, Nikiteas N, Kanakis M, Georgiou E. The contribution of simulation training in enhancing key components of laparoscopic competence. Am Surg. 2011;77:708-715.  [PubMed]  [DOI]
32.  Loukas C, Nikiteas N, Kanakis M, Georgiou E. Deconstructing laparoscopic competence in a virtual reality simulation environment. Surgery. 2011;149:750-760.  [PubMed]  [DOI]
33.  Loukas C, Nikiteas N, Schizas D, Lahanas V, Georgiou E. A head-to-head comparison between virtual reality and physical reality simulation training for basic skills acquisition. Surg Endosc. 2012;26:2550-2558.  [PubMed]  [DOI]
34.  Lucas S, Tuncel A, Bensalah K, Zeltser I, Jenkins A, Pearle M, Cadeddu J. Virtual reality training improves simulated laparoscopic surgery performance in laparoscopy naïve medical students. J Endourol. 2008;22:1047-1051.  [PubMed]  [DOI]
35.  Mansour S, Din N, Ratnasingham K, Irukulla S, Vasilikostas G, Reddy M, Wan A. Objective assessment of the core laparoscopic skills course. Minim Invasive Surg. 2012;2012:379625.  [PubMed]  [DOI]
36.  Munz Y, Almoudaris AM, Moorthy K, Dosis A, Liddle AD, Darzi AW. Curriculum-based solo virtual reality training for laparoscopic intracorporeal knot tying: objective assessment of the transfer of skill from virtual reality to reality. Am J Surg. 2007;193:774-783.  [PubMed]  [DOI]
37.  Munz Y, Kumar BD, Moorthy K, Bann S, Darzi A. Laparoscopic virtual reality and box trainers: is one superior to the other? Surg Endosc. 2004;18:485-494.  [PubMed]  [DOI]
38.  Palter VN, Grantcharov TP. Development and validation of a comprehensive curriculum to teach an advanced minimally invasive procedure: a randomized controlled trial. Ann Surg. 2012;256:25-32.  [PubMed]  [DOI]
39.  Palter VN, Orzech N, Reznick RK, Grantcharov TP. Validation of a structured training and assessment curriculum for technical skill acquisition in minimally invasive surgery: a randomized controlled trial. Ann Surg. 2013;257:224-230.  [PubMed]  [DOI]
40.  Panait L, Larios JM, Brenes RA, Fancher TT, Ajemian MS, Dudrick SJ, Sanchez JA. Surgical skills assessment of applicants to general surgery residency. J Surg Res. 2011;170:189-194.  [PubMed]  [DOI]
41.  Rinewalt D, Du H, Velasco JM. Evaluation of a novel laparoscopic simulation laboratory curriculum. Surgery. 2012;152:550-554; discussion 550-554.  [PubMed]  [DOI]
42.  Rosenthal R, Gantert WA, Scheidegger D, Oertli D. Can skills assessment on a virtual reality trainer predict a surgical trainee’s talent in laparoscopic surgery? Surg Endosc. 2006;20:1286-1290.  [PubMed]  [DOI]
43.  Seymour NE, Gallagher AG, Roman SA, O’Brien MK, Bansal VK, Andersen DK, Satava RM. Virtual reality training improves operating room performance: results of a randomized, double-blinded study. Ann Surg. 2002;236:458-463; discussion 463-464.  [PubMed]  [DOI]
44.  Sharma M, Macafee D, Horgan AF. Basic laparoscopic skills training using fresh frozen cadaver: a randomized controlled trial. Am J Surg. 2013;206:23-31.  [PubMed]  [DOI]
45.  Stefanidis D, Yonce TC, Korndorffer JR, Phillips R, Coker A. Does the incorporation of motion metrics into the existing FLS metrics lead to improved skill acquisition on simulators? A single blinded, randomized controlled trial. Ann Surg. 2013;258:46-52.  [PubMed]  [DOI]
46.  Stelzer MK, Abdel MP, Sloan MP, Gould JC. Dry lab practice leads to improved laparoscopic performance in the operating room. J Surg Res. 2009;154:163-166.  [PubMed]  [DOI]
47.  Tanoue K, Uemura M, Kenmotsu H, Ieiri S, Konishi K, Ohuchida K, Onimaru M, Nagao Y, Kumashiro R, Tomikawa M. Skills assessment using a virtual reality simulator, LapSim, after training to develop fundamental skills for endoscopic surgery. Minim Invasive Ther Allied Technol. 2010;19:24-29.  [PubMed]  [DOI]
48.  Torkington J, Smith SG, Rees B, Darzi A. The role of the basic surgical skills course in the acquisition and retention of laparoscopic skill. Surg Endosc. 2001;15:1071-1075.  [PubMed]  [DOI]
49.  van Rijssen LB, van Empel PJ, Huirne JA, Bonjer HJ, Cuesta MA, Meijerink WJ. [Simulation-based training in minimally invasive surgery: the Advanced Suturing Course]. Ned Tijdschr Geneeskd. 2012;156:A4036.  [PubMed]  [DOI]
50.  Varas J, Mejía R, Riquelme A, Maluenda F, Buckel E, Salinas J, Martínez J, Aggarwal R, Jarufe N, Boza C. Significant transfer of surgical skills obtained with an advanced laparoscopic training program to a laparoscopic jejunojejunostomy in a live porcine model: feasibility of learning advanced laparoscopy in a general surgery residency. Surg Endosc. 2012;26:3486-3494.  [PubMed]  [DOI]
51.  Beard JD, Marriott J, Purdie H, Crossley J. Assessing the surgical skills of trainees in the operating theatre: a prospective observational study of the methodology. Health Technol Assess. 2011;15:i-xxi, 1-162.  [PubMed]  [DOI]
52.  Fernandez GL, Page DW, Coe NP, Lee PC, Patterson LA, Skylizard L, St Louis M, Amaral MH, Wait RB, Seymour NE. Boot cAMP: educational outcomes after 4 successive years of preparatory simulation-based training at onset of internship. J Surg Educ. 2012;69:242-248.  [PubMed]  [DOI]
53.  Mittal MK, Dumon KR, Edelson PK, Acero NM, Hashimoto D, Danzer E, Selvan B, Resnick AS, Morris JB, Williams NN. Successful implementation of the american college of surgeons/association of program directors in surgery surgical skills curriculum via a 4-week consecutive simulation rotation. Simul Healthc. 2012;7:147-154.  [PubMed]  [DOI]
54.  Parent RJ, Plerhoples TA, Long EE, Zimmer DM, Teshome M, Mohr CJ, Ly DP, Hernandez-Boussard T, Curet MJ, Dutta S. Early, intermediate, and late effects of a surgical skills “boot camp” on an objective structured assessment of technical skills: a randomized controlled study. J Am Coll Surg. 2010;210:984-989.  [PubMed]  [DOI]
55.  Ende A, Zopf Y, Konturek P, Naegel A, Hahn EG, Matthes K, Maiss J. Strategies for training in diagnostic upper endoscopy: a prospective, randomized trial. Gastrointest Endosc. 2012;75:254-260.  [PubMed]  [DOI]
56.  Götzberger M, Rösch T, Geisenhof S, Gülberg V, Schmitt W, Niemann G, Kopp VM, Faiss S, Heldwein W, Fischer MR. Effectiveness of a novel endoscopy training concept. Endoscopy. 2011;43:802-807.  [PubMed]  [DOI]
57.  Haycock A, Koch AD, Familiari P, van Delft F, Dekker E, Petruzziello L, Haringsma J, Thomas-Gibson S. Training and transfer of colonoscopy skills: a multinational, randomized, blinded, controlled trial of simulator versus bedside training. Gastrointest Endosc. 2010;71:298-307.  [PubMed]  [DOI]
58.  Haycock AV, Youd P, Bassett P, Saunders BP, Tekkis P, Thomas-Gibson S. Simulator training improves practical skills in therapeutic GI endoscopy: results from a randomized, blinded, controlled study. Gastrointest Endosc. 2009;70:835-845.  [PubMed]  [DOI]
59.  Shirai Y, Yoshida T, Shiraishi R, Okamoto T, Nakamura H, Harada T, Nishikawa J, Sakaida I. Prospective randomized study on the use of a computer-based endoscopic simulator for training in esophagogastroduodenoscopy. J Gastroenterol Hepatol. 2008;23:1046-1050.  [PubMed]  [DOI]
60.  Van Sickle KR, Buck L, Willis R, Mangram A, Truitt MS, Shabahang M, Thomas S, Trombetta L, Dunkin B, Scott D. A multicenter, simulation-based skills training collaborative using shared GI Mentor II systems: results from the Texas Association of Surgical Skills Laboratories (TASSL) flexible endoscopy curriculum. Surg Endosc. 2011;25:2980-2986.  [PubMed]  [DOI]
61.  Paisley AM, Baldwin PJ, Paterson-Brown S. Validity of surgical simulation for the assessment of operative skill. Br J Surg. 2001;88:1525-1532.  [PubMed]  [DOI]
62.  Cuschieri A, Francis N, Crosby J, Hanna GB. What do master surgeons think of surgical competence and revalidation? Am J Surg. 2001;182:110-116.  [PubMed]  [DOI]
63.  Shah J, Darzi A. Surgical skills assessment: an ongoing debate. BJU Int. 2001;88:655-660.  [PubMed]  [DOI]
64.  Gallagher AG, Neary P, Gillen P, Lane B, Whelan A, Tanner WA, Traynor O. Novel method for assessment and selection of trainees for higher surgical training in general surgery. ANZ J Surg. 2008;78:282-290.  [PubMed]  [DOI]
65.  Carroll SM, Kennedy AM, Traynor O, Gallagher AG. Objective assessment of surgical performance and its impact on a national selection programme of candidates for higher surgical training in plastic surgery. J Plast Reconstr Aesthet Surg. 2009;62:1543-1549.  [PubMed]  [DOI]
66.  Reznick R, Regehr G, MacRae H, Martin J, McCulloch W. Testing technical skill via an innovative “bench station” examination. Am J Surg. 1997;173:226-230.  [PubMed]  [DOI]
67.  Vassiliou MC, Feldman LS, Andrew CG, Bergman S, Leffondré K, Stanbridge D, Fried GM. A global assessment tool for evaluation of intraoperative laparoscopic skills. Am J Surg. 2005;190:107-113.  [PubMed]  [DOI]
68.  Schijven M, Jakimowicz J. Simulators, first experiences. Minim Invasive Ther Allied Technol. 2003;12:151-154.  [PubMed]  [DOI]
69.  Buckley CE, Nugent E, Ryan D, Neary P.  Virtual reality – A new era in surgical training. In: Eichenberg C. Virtual Reality in Psychological, Medical and Pedagogical Applications. Intech 2012;  Available from:  [PubMed]  [DOI]
70.  Duffy AJ, Hogle NJ, McCarthy H, Lew JI, Egan A, Christos P, Fowler DL. Construct validity for the LAPSIM laparoscopic surgical simulator. Surg Endosc. 2005;19:401-405.  [PubMed]  [DOI]
71.  van Dongen KW, Tournoij E, van der Zee DC, Schijven MP, Broeders IA. Construct validity of the LapSim: can the LapSim virtual reality simulator distinguish between novices and experts? Surg Endosc. 2007;21:1413-1417.  [PubMed]  [DOI]
72.  Zhang A, Hünerbein M, Dai Y, Schlag PM, Beller S. Construct validity testing of a laparoscopic surgery simulator (Lap Mentor): evaluation of surgical skill with a virtual laparoscopic training simulator. Surg Endosc. 2008;22:1440-1444.  [PubMed]  [DOI]
73.  Andreatta PB, Woodrum DT, Birkmeyer JD, Yellamanchilli RK, Doherty GM, Gauger PG, Minter RM. Laparoscopic skills are improved with LapMentor training: results of a randomized, double-blinded study. Ann Surg. 2006;243:854-860; discussion 860-863.  [PubMed]  [DOI]
74.  Maithel S, Sierra R, Korndorffer J, Neumann P, Dawson S, Callery M, Jones D, Scott D. Construct and face validity of MIST-VR, Endotower, and CELTS: are we ready for skills assessment using simulators? Surg Endosc. 2006;20:104-112.  [PubMed]  [DOI]
75.  Neary PC, Boyle E, Delaney CP, Senagore AJ, Keane FB, Gallagher AG. Construct validation of a novel hybrid virtual-reality simulator for training and assessing laparoscopic colectomy; results from the first course for experienced senior laparoscopic surgeons. Surg Endosc. 2008;22:2301-2309.  [PubMed]  [DOI]
76.  Gilliam AD. Construct validity of the ProMIS laparoscopic simulator. Surg Endosc. 2009;23:1150.  [PubMed]  [DOI]
77.  Grantcharov TP, Carstensen L, Schulze S. Objective assessment of gastrointestinal endoscopy skills using a virtual reality simulator. JSLS. 2005;9:130-133.  [PubMed]  [DOI]
78.  Moorthy K, Munz Y, Jiwanji M, Bann S, Chang A, Darzi A. Validity and reliability of a virtual reality upper gastrointestinal simulator and cross validation using structured assessment of individual performance with video playback. Surg Endosc. 2004;18:328-333.  [PubMed]  [DOI]
79.  Poulose BK, Vassiliou MC, Dunkin BJ, Mellinger JD, Fanelli RD, Martinez JM, Hazey JW, Sillin LF, Delaney CP, Velanovich V. Fundamentals of Endoscopic Surgery cognitive examination: development and validity evidence. Surg Endosc. 2014;28:631-638.  [PubMed]  [DOI]
80.  Delaney CP, Neary P, Heriot AG, Senagore AJ.  Operative Techniques in Laparoscopic Colorectal Surgery. 2nd ed. Philadelphia: Wolters Kluwer Health 2013; 1-2.  [PubMed]  [DOI]
81.  Sarker SK, Patel B. Simulation and surgical training. Int J Clin Pract. 2007;61:2120-2125.  [PubMed]  [DOI]
82.  Martin JA, Regehr G, Reznick R, MacRae H, Murnaghan J, Hutchison C, Brown M. Objective structured assessment of technical skill (OSATS) for surgical residents. Br J Surg. 1997;84:273-278.  [PubMed]  [DOI]
83.  Datta V, Mackay S, Mandalia M, Darzi A. The use of electromagnetic motion tracking analysis to objectively measure open surgical skill in the laboratory-based model. J Am Coll Surg. 2001;193:479-485.  [PubMed]  [DOI]
84.  Bann S, Kwok KF, Lo CY, Darzi A, Wong J. Objective assessment of technical skills of surgical trainees in Hong Kong. Br J Surg. 2003;90:1294-1299.  [PubMed]  [DOI]
85.  Dosis A, Bello F, Moorthy K, Munz Y, Gillies D, Darzi A. Real-time synchronization of kinematic and video data for the comprehensive assessment of surgical skills. Medicine Meets Virtual Reality 12. The Netherlands: IOS 2004; 82-88.  [PubMed]  [DOI]
86.  Szalay D, MacRae H, Regehr G, Reznick R. Using operative outcome to assess technical skill. Am J Surg. 2000;180:234-237.  [PubMed]  [DOI]
87.  Scott DJ, Goova MT, Tesfay ST. A cost-effective proficiency-based knot-tying and suturing curriculum for residency programs. J Surg Res. 2007;141:7-15.  [PubMed]  [DOI]
88.  Patel NV, Robbins JM, Shanley CJ. Low-fidelity exercises for basic surgical skills training and assessment. Am J Surg. 2009;197:119-125.  [PubMed]  [DOI]
89.  Batra EK, Taylor PT, Franz DA, Towler MA, Edlich RF. A portable tensiometer for assessing surgeon’s knot tying technique. Gynecol Oncol. 1993;48:114-118.  [PubMed]  [DOI]
90.  Van Sickle KR, Smith B, McClusky DA, Baghai M, Smith CD, Gallagher AG. Evaluation of a tensiometer to provide objective feedback in knot-tying performance. Am Surg. 2005;71:1018-1023.  [PubMed]  [DOI]
91.  Muffly TM, Danford JM, Iqbal I, Barber MD. Assessment of four tissue models on knot tensile strength. J Surg Educ. 2012;69:13-16.  [PubMed]  [DOI]
92.  Ching SS, Mok CW, Koh YX, Tan SM, Tan YK. Assessment of surgical trainees’ quality of knot-tying. J Surg Educ. 2013;70:48-54.  [PubMed]  [DOI]
93.  Brydges R, Carnahan H, Dubrowski A. Assessing suturing skills in a self-guided learning setting: absolute symmetry error. Adv Health Sci Educ Theory Pract. 2009;14:685-695.  [PubMed]  [DOI]
94.  Buckley CE, Kavanagh DO, Traynor O, Neary PC. Is the skillset obtained in surgical simulation transferable to the operating theatre? Am J Surg. 2014;207:146-157.  [PubMed]  [DOI]
95.  Sturm LP, Windsor JA, Cosman PH, Cregan P, Hewett PJ, Maddern GJ. A systematic review of skills transfer after surgical simulation training. Ann Surg. 2008;248:166-179.  [PubMed]  [DOI]
96.  Datta V, Chang A, Mackay S, Darzi A. The relationship between motion analysis and surgical technical assessments. Am J Surg. 2002;184:70-73.  [PubMed]  [DOI]
97.  Seymour NE. VR to OR: a review of the evidence that virtual reality simulation improves operating room performance. World J Surg. 2008;32:182-188.  [PubMed]  [DOI]