The Making of Surgeons

Surgeons are trained, not born

As I’ve written about before (HERE), the acquisition of skills or knowledge in any field requires vast amounts of practice. This requirement for many hours of practice in order to attain competence, or indeed expertise, holds true across disciplines: athletics, music, chess, writing, and yes…surgery.1 In their article, “The Making of An Expert,” psychologist Anders Ericsson and colleagues state, “Consistently and overwhelmingly, the evidence showed that experts are always made, not born.”1

So surgeons are not born, and there is rigorous training to go through. But what does this training look like? And how do we optimize this training and practice to create better surgeons? Personally, I have had my share of orthopedic surgeries, and as a student physical therapist I have worked with patients after various surgeries, so I can’t help but wonder how much the surgeon performing the operation affects the patient’s outcome and chance of success. There is a need for trainee surgeons to integrate research and practice; a need to learn clinical judgment; a need to learn the motor skills necessary for instrument handling, suturing, and many other surgical techniques…but the question remains: how best to do this?

“See one, do one, teach one”

The transformation of surgical education is in large part credited to William Stewart Halsted, who created the residency program for training surgeons in 1890, a model that remained the standard for the next 100 years! This program is based on the sequence “See one, do one, teach one,” in which a resident first observes a skill, then performs it, then is expected to teach it.6 The goal is to gradually increase residents’ participation until they reach near autonomy.6 Surgeon and author Atul Gawande, in his book, “Complications: A Surgeon’s Notes on an Imperfect Science,” eloquently describes his experience as a resident surgeon working within this paradigm to learn the technique of placing a central line, and the uncertainty and mistakes inherent in this learning process.3

More recently, there have been changes to medical residents’ education, both in structure and in method of instruction. Significantly, in 2003 the ACGME (Accreditation Council for Graduate Medical Education) limited resident duty hours to 80 hours per week in an attempt to improve patient safety; previously, residents had regularly worked up to 110 hours per week.5,6 In addition, the implementation of adult learning principles, including motor learning principles, into systems of medical teaching is becoming more widespread. One example of a newer system is “Peyton’s four-step approach,” which has demonstrated superior outcomes compared with standard medical education.7

“Peyton’s Four-Step Approach”7

Step 1- “Demonstration”: a skill is demonstrated by the instructor

Step 2- “Talk the trainee through”: a skill is demonstrated and explained by the instructor

Step 3- “Trainee talks the trainer through”: the student explains the skill while the instructor performs what is described

Step 4- “Trainee does”: the student performs the skill

Another updated perspective offers this advice: “1. See many, learn from each outcome. 2. Do many with supervision, learn from each outcome. 3. Teach many with supervision, learn from each outcome.”10 Here, as in “Peyton’s four-step approach,” there is a clear homage to Halsted’s initial model, with an updated acknowledgment that learning takes repetition (as deliberate practice), recall practice, and reflection.10 Additionally, opportunities for faculty mentorship, as well as medical simulation training to allow for additional practice, have been emphasized as keys to current and future successful medical education.5,6,10,12

As Ericsson et al. write about the science of pursuing expertise: “All the superb performers he [Benjamin Bloom, author of the seminal book “Developing Talent in Young People”] investigated had practiced intensively, had studied with devoted teachers, and had been supported enthusiastically by their families throughout their developing years.” (emphasis mine)1

Practicing…surgery?!

“Implicit in this model is that we practice on the very patients we are delivering care to.”

-Peter Weinstock

As Peter Weinstock describes in his TED Talk, “Lifelike Simulations That Make Real Life Surgery Safer,” there is a need for alternative practice paradigms (such as medical simulation training), not only to train medical residents for ‘routine’ surgeries but to hone skills and expertise for rare surgeries that may be performed less frequently.12 Simulations provide a way to acquire many more ‘low stakes’ practice repetitions, which may be crucially important, as increasing volume of practice has been found to correlate with improvement in surgeons’ outcomes.6

As Weinstock reiterates: “Medicine may be the last high stakes industry that does not practice prior to game time.” Medical simulation practice can include practice with actors (“standardized patients”), virtual reality programs, or mannequin simulators.6 Weinstock’s description of the elaborate nature of certain “life-like” mannequin simulations includes the use of CT/MRI scanning technology, 3D printers, and “Hollywood special effects” to create the most accurate life-like representation of a specific individual on which to practice a surgery. Showing pictures of this process, Weinstock notes: “you’ll notice here that every hair on the child’s head is reproduced.”12 Weinstock goes on to say that surgeons can practice these surgeries in this way as many times as they need to feel comfortable before operating on the actual person, embracing the idea of “Operate twice, cut once.”12

Needless to say, I really recommend watching his TED Talk! It is truly inspiring!

Who should hold the scalpel?

Given the need to practice and accumulate many repetitions, it may seem that surgeons with the most years of experience would always have superior outcomes compared with junior surgeons or residents. However, this is not always the case. As Gawande writes, “Even doctors with great knowledge and technical skill can have mediocre results; more nebulous factors like aggressiveness and diligence and ingenuity can matter enormously.”2 Gawande reminds us that there is a human element to this matter and that humans make mistakes no matter their level of experience. Additionally, Gawande attests that, just as in a variety of professions, there are some good doctors, some bad doctors, and a lot of average doctors. He writes, “if the bell curve is a fact, then so is the reality that most doctors are going to be average.”2

Back to the question of experience: do trainees or residents necessarily have worse outcomes than more experienced surgeons? There is a commonly discussed idea in U.S. hospitals called “The July Effect,” which posits that during the mass changeover in July from experienced residents to new residents fresh out of medical school, patient care and health outcomes are adversely affected.9,13 One systematic review investigated this supposed effect in the research literature from 1989 to 2010 and, though reporting increased mortality and decreased efficiency during the cohort changeover period, found inconclusive evidence as to patient morbidity and medical errors.13 Indeed, “The July Effect” has not been replicated in all settings and patient mixes, and it has been noted that much of the research to date has failed to account for seasonal trends in hospital admissions or patient mix when studying this effect.8,9,13 The research to date is also unclear on a cause for this supposed effect: whether it may be due to decreased collective clinical experience, reduced familiarity with the setting, or lack of supervision of residents in new roles.13

Some of the differences found between trainee surgeons and experienced surgeons may not have clinical significance or impact patient outcomes, as in one study where the increased operative time of trainee surgeons did not affect outcomes of minimally invasive total hip arthroplasty (THA).11 In some cases, trainees or doctors with less clinical experience may actually offer greater benefit to patients. One unique study investigated this issue by assessing patient care and outcomes during the dates of national cardiology meetings and found lower mortality rates for certain high-risk patients during meeting dates (when it is assumed that thousands of physicians are at these meetings and residents or doctors with less clinical experience are put in charge of patient care).4 Suffice it to say, at this time there is no consistent evidence arguing for better outcomes based on surgeon experience across all settings and patient characteristics.

Where do we go from here? Well, because Dr. Gawande’s writing has served as the inspiration for much of my research in this area, I think it is only fitting to end with a quote from his book, “Better: A Surgeon’s Notes on Performance.”2 With the goal of improving patient care and training for surgeons, Gawande writes, “Arriving at meaningful solutions is an inevitably slow and difficult process. Nonetheless, what I saw was: better is possible. It does not take genius. It takes diligence. It takes moral clarity. It takes ingenuity. And above all, it takes a willingness to try.”


References:

  1. Ericsson KA, Prietula MJ, Cokely ET. The making of an expert. Harvard Business Review. 2007;85(7-8):114-121.
  2. Gawande, A. Better: A Surgeon’s Notes on Performance. New York: Metropolitan Books, Henry Holt and Company; 2007.
  3. Gawande, A. Complications: A Surgeon’s Notes on an Imperfect Science. New York: Metropolitan Books, Henry Holt and Company; 2002.
  4. Jena AB, Prasad V, Goldman DP, Romley J. Mortality and Treatment Patterns Among Patients Hospitalized With Acute Cardiovascular Conditions During Dates of National Cardiology Meetings. JAMA Internal Medicine. 2015;175(2):237.
  5. Kavic MS. Teaching and Training Surgery to the Next Generation of Surgeons. JSLS : Journal of the Society of Laparoendoscopic Surgeons. 2011;15(3):279-281.
  6. Kotsis SV, Chung KC. Application of the “See One, Do One, Teach One” Concept in Surgical Training. Plastic and Reconstructive Surgery. 2013;131(5):1194-1201.
  7. Krautter M, Dittrich R, Safi A, et al. Peyton’s four-step approach: differential effects of single instructional steps on procedural and memory performance — a clarification study. Advances in Medical Education and Practice. 2015;6:399-406.
  8. Malik AT, Ali A, Mufarrih SH, Noordin S. Do new trainees pose a threat to the functional outcome of total knee arthroplasty? – The “January/July” effect in a developing South Asian country: A retrospective cohort study. International Journal of Surgery Open. 2017;9:13-18.
  9. Mims LD, Porter M, Simpson KN, Carek PJ. The “July Effect”: A Look at July Medical Admissions in Teaching Hospitals. The Journal of the American Board of Family Medicine. 2017;30(2):189-195.
  10. Rohrich RJ. “See One, Do One, Teach One”: An Old Adage with a New Twist. Plastic and Reconstructive Surgery. 2006;118(1):257-258.
  11. Weber M, Benditz A, Woerner M, Weber D, Grifka J, Renkawitz T. Trainee Surgeons Affect Operative Time but not Outcome in Minimally Invasive Total Hip Arthroplasty. Scientific Reports. 2017;7(1).
  12. Weinstock, Peter (January 2016). Lifelike simulations that make real-life surgery safer. Retrieved from: https://www.ted.com/talks/peter_weinstock_lifelike_simulations_that_make_real_life_surgery_safer#t-1067004
  13. Young JQ, Ranji SR, Wachter RM, Lee CM, Niehaus B, Auerbach AD. “July effect”: impact of the academic year-end changeover on patient outcomes: a systematic review. Annals of Internal Medicine. 2011;155:309-315.