Simulation debriefing

OVERVIEW

Many definitions of simulation debriefing exist (Fanning & Gaba, 2007; Sawyer et al, 2016; Grant et al, 2018). Simulation debriefing can be viewed as:

  • learning conversations between two or more people occurring during or after a simulated event that involves reflection on performance, identification of performance gaps, exploration of the rationale for behaviours, and seeking solutions.

A distinction can be made between debriefing and feedback in clinical education, although the boundaries are not well demarcated (Voyer & Hatala, 2015).

  • In the context of simulation, feedback is viewed as one-way delivery of performance information to simulation participants by a facilitator with the intent to improve future performance (Sawyer et al, 2016). This is more specifically termed “directive feedback”, a preferable term because broader conceptions of feedback regard two-way dialogue as important for its effectiveness.

DEBRIEFING AND EDUCATIONAL THEORIES

The rationale for debriefing is derived from numerous educational theories, including experiential learning and reflective practice (Fanning & Gaba, 2007).

Debriefing is considered vital to learning from simulation, and for the transfer of learning so that it can be applied to other situations (Rivière et al, 2019). It is even sometimes claimed (tongue in cheek…) that “simulation is just an excuse to debrief”.

STRUCTURAL ELEMENTS OF THE DEBRIEF

Lederman (1992) identified the common structural elements of a debrief:

  1. Debriefer(s)
  2. Participants to debrief (debriefers and participants may be the same people in a self-guided debrief)
  3. An experience (i.e. the simulation scenario)
  4. The impact of the experience (usually requires relevance to the individual and an emotional activation) 
  5. Recollection (participants recall the experience)
  6. Report (usually verbal)
  7. Time (how much time passes between the event and the debrief affects how the experience is viewed)

DEBRIEFING APPROACHES

Debriefing can be classified, according to timing and facilitation, into 3 categories (Sawyer et al, 2016):

  • Facilitator-guided post event debriefing
    • Trained facilitator(s) guide the debrief conversation
    • Facilitator may act as a co-learner or as a subject matter expert
    • Shown to improve individual and team performance in numerous contexts
  • Self-guided post event debriefing
    • Participants debrief themselves
    • Typically involves the use of a cognitive aid or cue card to guide discussion, reflection, and self-assessment
    • Less studied, but some evidence that it improves behavioral skills
  • Facilitator-guided within event debriefing
    • Simulation is interrupted by the facilitator when needed to promote learning
    • May occur as “pause and discuss” prior to continuation of the simulation, or as rapid cycle deliberate practice (RCDP) where the simulation is “rewound” and restarted so that the participants can immediately put information from the debrief into practice. 
    • Evidence exists that it is beneficial for developing technical skills and achieving mastery learning goals. However, compared with post event debriefing, it may be less effective for long-term skill retention and less preferred by learners.

ROLE OF THE DEBRIEFER

Roles of the debriefer in facilitator-guided debriefing include:

  • Facilitator – enables a structured conversation to occur
  • Guide – role models effective learning and conversational strategies; leads the group to useful debriefing outcomes
  • Teacher – provides instruction (e.g. as a subject matter expert (SME))
  • Mediator – reconciles differences between participants
  • Co-learner – learns alongside the participants, especially when not a SME

EXAMPLES OF POSTEVENT FACILITATOR-GUIDED DEBRIEFING MODELS

There are numerous models of simulation debriefing (Fanning & Gaba, 2007; Sawyer et al, 2016). Sawyer et al (2016) categorise different models according to their conversation structure (number of “phases”):

Three phase conversation structures

  • Debriefing with Good Judgement (“frames-orientated debriefing”) (Rudolph et al, 2006)
    • Reaction, Analysis, Summary
  • 3D model (Zigmont et al, 2011)
    • Defusing, Discovering, Deepening
  • GAS (Phrampus & O’Donnell, 2013)
    • Gather, Analyse, Summarise
  • Diamond debrief (Jaye et al, 2015)
    • Description, Analysis, Application

Multiphase conversation structures

  • PEARLS (Eppich & Cheng, 2015; Bajaj et al, 2018)
    • Reaction, Description, Analysis, Summary
  • TeamGAINS (Kolbe et al, 2013)
    • Reaction, Discuss clinical component, Transfer from simulation to reality, Discuss behavioural skills, Summary, Supervised practice if needed
  • Healthcare Simulation After Action Review (Sawyer & Deering, 2013) 
    • DEBRIEF: Define rules, Explain learning objectives, Benchmark performance, Review expected actions, Identify what happened, Examine why things happened the way they did, Formalize learning
    • Based on US Army methodology
  • FAST-PAGE
    • Feelings & Facts, Analysis (Preview or Plus/Delta, Advocacy-Inquiry, Generalise, Explain & educate), Summary, Take homes

There is no evidence that one model is better than any other:

  • Whether or not debriefing occurs is likely more important than the model used
  • Different approaches can be used according to the context and the skillset of the debriefer

DEBRIEF PROCESS ELEMENTS

Debrief process elements (techniques, strategies, and tools used to optimise learning) have been identified by Sawyer et al (2016) as being of 3 types:

  • Essential elements
  • Conversational techniques and educational strategies
  • Debriefing adjuncts

Essential elements (Sawyer et al, 2016) are:

  • Psychological safety (able to act without fear of consequences)
  • Debriefing stance or assumption (i.e. treat each other with positive regard – that they are doing their best and want to learn)
  • Establish debriefing rules (e.g. confidentiality, treating the simulation as if it were real, focus on improvement)
  • Shared mental model (so that everyone is aware of the events that took place in the simulation)
  • Address learning objectives (either pre-determined or learner-generated)
  • Open-ended questions (encourage self-reflection and convey curiosity)
  • Using silence (allows participants to formulate thoughts, process events, reflect, and form considered responses)

Conversational techniques and educational strategies (Sawyer et al, 2016) include:

  • Learner self-assessment (aka “plus/delta”: what went well? what could be improved?)
  • Directive feedback (one-way advice to the learner on how to close a performance gap)
  • Focused facilitation techniques
    • Advocacy inquiry (used to gain insights into the underlying frames behind observed behaviour; the facilitator previews the discussion topic, describes an observation, shares their opinion, and asks what was going on in the participant’s mind at the time)
    • Circular questions (asking a third person to describe the relationship between two other people in their presence)
    • Guided team self-correction (participants are asked to compare their performance against components of a pre-specified model of teamwork)

Debriefing adjuncts (Sawyer et al, 2016) include:

  • Codebriefer (use of more than one facilitator)
  • Debriefing scripts and cognitive aids (especially useful for novice facilitators or self-guided postevent debriefs)
  • Video review (used to create a shared mental model of events and provide objective evidence of performance; though evidence of additional benefit is lacking (Cheng et al, 2017))
  • Assessment tools (e.g. Non-technical skills for surgeons (NOTSS))

Not all conversational techniques and educational strategies, or debriefing adjuncts, are appropriate to all 3 of the different categories of debriefing approaches (e.g. a codebriefer is not involved in self-guided post event debriefing).

PEARLS FRAMEWORK

The PEARLS framework is a blended approach to debriefing that can be flexibly applied according to context (Eppich & Cheng, 2015; Bajaj et al, 2018). It features many of the commonalities of other models.

The four phases of PEARLS (Eppich & Cheng, 2015; Bajaj et al, 2018) are:

  • Reaction phase
    • Check for initial reactions to allow emotions to be defused before proceeding (“emotion before cognition”) and identify learner-generated learning objectives
    • “What are your initial reactions?”, “What is at the forefront of your minds?”
  • Description phase
    • Outline the key events of the simulation using either a learner-centered approach (i.e. a participant provides the description) or an instructor-centered approach (the facilitator provides the description) to develop a shared mental model of the events that took place
    • Learner-centered approach is more likely appropriate if:
      • More time
      • Learner willingness/expectation
      • Knowledge of learner’s skill to synthesise
      • Simpler scenario
  • Analysis phase
    • Explore what happened, why, and what can be learned
    • Any of the conversational techniques and educational strategies described previously may be used according to:
      • Time
      • Domain
      • Rationale
      • Experience
  • Summary phase
    • Highlight key learning points from the discussion using either a learner-centered approach (i.e. learners share their “take home” points) or an instructor-centered approach (the facilitator summarises the discussion and emphasises key learning points)
    • It is useful to allow time for questions before summarising so that the debrief finishes on appropriate points of emphasis

STRATEGIES FOR DIFFICULT DEBRIEFING SITUATIONS

Grant et al (2018) have described different phenotypes of difficult debriefing situations. These include learners who:

  • are quiet or reticent
  • are disengaged or disinterested
  • dominate with poor insight and/or knowledge
  • dominate with good insight and/or knowledge
  • react emotionally
  • react with defensiveness

Difficult debriefing situations can be mitigated using proactive strategies (Grant et al, 2018):

  • Pre-briefing (creating a “safe container”)
  • Environment (privacy, seating, noise)
  • Body language (minimal encouragers, nods, smiles, leaning in and listening, open body posture)
  • Eye contact (sit at eye level, distribute eye contact, low visual dominance ratio)

A variety of reactive strategies and communication techniques can also be used (in addition to appropriate body language and eye contact), often in sequence, to resolve difficulties when they do occur (Grant et al, 2018):

  • Silence
  • Directive questioning (direct questions to specific individuals to steer conversation, establish broader consensus, or uncover thoughts from less vocal team members)
  • Normalisation (relating behaviours, feelings, or attitudes to a societal norm)
  • Validation (acknowledging that learners’ feelings, behaviours, or thoughts are acceptable)
  • Generalisation (applying a concept to a different context)
  • Paraphrasing (stating something in your own words)
  • Name the dynamic (explicitly naming maladaptive behaviour and making it a topic of discussion)
  • Broadening (involving others in the discussion)
  • Previewing (signposting the next topic for discussion)

DEBRIEFING ASSESSMENT TOOLS

Standardised tools may be used to assess debriefing and provide feedback on debriefers’ performance (“debrief of the debrief”). Tools include:

  • DASH (Debriefing Assessment for Simulation in Healthcare)
  • OSAD (Objective Structured Assessment of Debriefing)

OTHER INFORMATION

  • Local culture is likely an important factor in how debriefing is conducted (Ulmer et al, 2018). 
  • When simulation occurs in the workplace, there is a bidirectional impact of psychological safety (or lack thereof) between the two (Purdy et al, 2022).
  • Established approaches to simulation debriefing can also be adapted to virtual debriefing, although challenges specific to this format need to be considered (Cheng et al, 2020).
  • Simulated patients (aka standardized patients or patient actors), when involved in a simulation scenario, may also participate in debriefing (Pascucci et al, 2014). In general, they should speak in the first person and add value by sharing their perspective of participant interactions with them.
  • Simulation debriefing also has an important role in translational simulation, where the objective is to directly diagnose problems or to improve patient care processes, rather than individual learning (Nickson et al, 2021).
  • Many of the principles of simulation debriefing can also be applied to clinical debriefing, though many of the challenges of debriefing are amplified in this setting (see Clinical debriefing).

REFERENCES AND LINKS

Journal articles and textbooks

  • Bajaj K, Meguerdichian M, Thoma B, Huang S, Eppich W, Cheng A. The PEARLS Healthcare Debriefing Tool. Acad Med. 2018 Feb;93(2):336. doi: 10.1097/ACM.0000000000002035. PMID: 29381495.
  • Cheng A, Eppich W, Grant V, Sherbino J, Zendejas B, Cook DA. Debriefing for technology-enhanced simulation: a systematic review and meta-analysis. Med Educ. 2014 Jul;48(7):657-66. doi: 10.1111/medu.12432. PMID: 24909527.
  • Cheng A, Palaganas J, Eppich W, Rudolph J, Robinson T, Grant V. Co-debriefing for simulation-based education: a primer for facilitators. Simul Healthc. 2015 Apr;10(2):69-75. doi: 10.1097/SIH.0000000000000077. PMID: 25710318.
  • Cheng A, Kolbe M, Grant V, Eller S, Hales R, Symon B, Griswold S, Eppich W. A practical guide to virtual debriefings: communities of inquiry perspective. Adv Simul (Lond). 2020 Aug 12;5:18. doi: 10.1186/s41077-020-00141-1. PMID: 32817805; PMCID: PMC7422458.
  • Eppich W, Cheng A. Promoting Excellence and Reflective Learning in Simulation (PEARLS): development and rationale for a blended approach to health care simulation debriefing. Simul Healthc. 2015 Apr;10(2):106-15. doi: 10.1097/SIH.0000000000000072. PMID: 25710312.
  • Fanning RM, Gaba DM. The role of debriefing in simulation-based learning. Simul Healthc. 2007 Summer;2(2):115-25. doi: 10.1097/SIH.0b013e3180315539. PMID: 19088616.
  • Grant VJ, Robinson T, Catena H, Eppich W, Cheng A. Difficult debriefing situations: A toolbox for simulation educators. Med Teach. 2018 Jul;40(7):703-712. doi: 10.1080/0142159X.2018.1468558. Epub 2018 May 23. PMID: 29792100.
  • Jaye P, Thomas L, Reedy G. ‘The Diamond’: a structure for simulation debrief. Clin Teach. 2015 Jun;12(3):171-5. doi: 10.1111/tct.12300. PMID: 26009951; PMCID: PMC4497353.
  • Kolbe M, Weiss M, Grote G, Knauth A, Dambach M, Spahn DR, Grande B. TeamGAINS: a tool for structured debriefings for simulation-based team trainings. BMJ Qual Saf. 2013 Jul;22(7):541-53. doi: 10.1136/bmjqs-2012-000917. Epub 2013 Mar 22. PMID: 23525093.
  • Lederman LC. Debriefing: Toward a Systematic Assessment of Theory and Practice. Simulation & Gaming. 1992;23(2):145-160. doi:10.1177/1046878192232003 [abstract]
  • Pascucci RC, Weinstock PH, O’Connor BE, Fancy KM, Meyer EC. Integrating actors into a simulation program: a primer. Simul Healthc. 2014 Apr;9(2):120-6. doi: 10.1097/SIH.0b013e3182a3ded7. PMID: 24096918.
  • Phrampus P, O’Donnell J. Debriefing using a structured and supported approach. In: Levine A, DeMaria S, Schwartz A, Sim A, eds. The Comprehensive Textbook of Healthcare Simulation. 1st ed. New York, NY: Springer; 2013:73-85.
  • Purdy E, Borchert L, El-Bitar A, Isaacson W, Bills L, Brazil V. Taking simulation out of its “safe container”-exploring the bidirectional impacts of psychological safety and simulation in an emergency department. Adv Simul (Lond). 2022;7(1):5. Published 2022 Feb 5. doi:10.1186/s41077-022-00201-8 [article]
  • Rivière E, Jaffrelot M, Jouquan J, Chiniara G. Debriefing for the Transfer of Learning: The Importance of Context. Acad Med. 2019 Jun;94(6):796-803. doi: 10.1097/ACM.0000000000002612. PMID: 30681450.
  • Rudolph JW, Simon R, Dufresne RL, Raemer DB. There’s no such thing as “nonjudgmental” debriefing: a theory and method for debriefing with good judgment. Simul Healthc. 2006 Spring;1(1):49-55. doi: 10.1097/01266021-200600110-00006. PMID: 19088574.
  • Rudolph JW, Simon R, Raemer DB, Eppich WJ. Debriefing as formative assessment: closing performance gaps in medical education. Acad Emerg Med. 2008 Nov;15(11):1010-6. doi: 10.1111/j.1553-2712.2008.00248.x. Epub 2008 Oct 20. PMID: 18945231.
  • Sawyer TL, Deering S. Adaptation of the US Army’s After-Action Review for simulation debriefing in healthcare. Simul Healthc. 2013 Dec;8(6):388-97. doi: 10.1097/SIH.0b013e31829ac85c. PMID: 24096913.
  • Sawyer T, Eppich W, Brett-Fleegler M, Grant V, Cheng A. More Than One Way to Debrief: A Critical Review of Healthcare Simulation Debriefing Methods. Simul Healthc. 2016 Jun;11(3):209-17. doi: 10.1097/SIH.0000000000000148. PMID: 27254527.
  • Ulmer FF, Sharara-Chami R, Lakissian Z, Stocker M, Scott E, Dieckmann P. Cultural Prototypes and Differences in Simulation Debriefing. Simul Healthc. 2018 Aug;13(4):239-246. doi: 10.1097/SIH.0000000000000320. PMID: 29672469.
  • Voyer S, Hatala R. Debriefing and feedback: two sides of the same coin? Simul Healthc. 2015 Apr;10(2):67-8. doi: 10.1097/SIH.0000000000000075. PMID: 25710319.
  • Zigmont JJ, Kappus LJ, Sudikoff SN. The 3D model of debriefing: defusing, discovering, and deepening. Semin Perinatol. 2011 Apr;35(2):52-8. doi: 10.1053/j.semperi.2011.01.003. PMID: 21440811.

Chris is an Intensivist and ECMO specialist at the Alfred ICU in Melbourne. He is also a Clinical Adjunct Associate Professor at Monash University. He is a co-founder of the Australia and New Zealand Clinician Educator Network (ANZCEN) and is the Lead for the ANZCEN Clinician Educator Incubator programme. He is on the Board of Directors for the Intensive Care Foundation and is a First Part Examiner for the College of Intensive Care Medicine. He is an internationally recognised Clinician Educator with a passion for helping clinicians learn and for improving the clinical performance of individuals and collectives.

After finishing his medical degree at the University of Auckland, he continued post-graduate training in New Zealand as well as Australia’s Northern Territory, Perth and Melbourne. He has completed fellowship training in both intensive care medicine and emergency medicine, as well as post-graduate training in biochemistry, clinical toxicology, clinical epidemiology, and health professional education.

He is actively involved in using translational simulation to improve patient care and the design of processes and systems at Alfred Health. He coordinates the Alfred ICU’s education and simulation programmes and runs the unit’s education website, INTENSIVE. He created the ‘Critically Ill Airway’ course and teaches on numerous courses around the world. He is one of the founders of the FOAM movement (Free Open-Access Medical education) and is co-creator of litfl.com, the RAGE podcast, the Resuscitology course, and the SMACC conference.

His one great achievement is being the father of three amazing children.

On Twitter, he is @precordialthump.
