Many simulation educators cite debriefing as the hardest part of their job. We know there are lots of ‘wrong’ ways to debrief – we’ve all been there!
In the absence of empirical evidence for the best way to debrief, ‘gurus’ of debriefing have emerged (usually not self-anointed!), each with avid disciples of their techniques and models. The article “More than one way to debrief”1 is a timely contribution to help us realize there are lots of ‘right’ ways. Debriefing models and methods are not in competition, but rather offer a suite of possibilities for the debriefer to use as they are “thinking on their feet”,2 appropriate to the situation.
The key message from the paper for me was to make sure my debriefings have structure, principles and a series of common elements, rather than recipes or overly prescriptive formats.
Breaking that down…
The paper1 starts by providing context for where our thinking on debriefing stands in 2016. For those new to the simulation and debriefing literature, this succinctly captures a large volume of theory and practice. I like the discussion about the difference between debriefing and feedback, although I perhaps don’t share that view myself. I have always thought good feedback on the floor was also ‘bidirectional and reflective’. (For more on feedback – a lovely blog and recorded talk3 from my @StEmlyns friends, with heaps of resources)
The methods are described as a ‘critical review’. For most readers with limited experience of qualitative research methods, this is a nice departure point to reflect on how best to synthesize heterogeneous qualitative literature. The authors justify their approach and explain it in detail, such that it could potentially be replicated, and they offer references for those interested in more. Clearly a highly protocolised systematic review just won’t do the job. As the authors say – this is not a yes/no question but rather a question of “which debriefing methods are best for which contexts and for whom”.
The authors then structure their review around key topics – the timing, facilitation methods, conversational structures, and process elements used during healthcare simulation.
The timing issue was a bigger chunk of the article than I would have anticipated. I am most familiar with the post-event, facilitator-guided model, and I liked how this made me challenge that approach as being right for all my debriefing contexts.
The segment on facilitator-guided post-event debriefing conversational structure offers wonderful clarity in comparing and contrasting the various models offered – including the three-phase and multiphase conversational structures. Table 3 underlines the paper’s title – more than one way to debrief – while also illustrating that many of the differences are not so substantive.
I was not surprised to see the essential elements common across methods – Tables 2 and 4. Prioritizing issues like pre-briefing content (psychological safety, debriefing stance), establishing debriefing rules, addressing learning objectives, using silence, conversational techniques, learner self-assessment, and co-debriefers makes intuitive sense. Context is everything when it comes to the exact approach taken. Each element is dealt with succinctly by the authors. The list in Table 4 could provide a structure for post-debriefing reflection on performance, a ‘debrief the debrief’, or even a formal assessment of debriefing, informing tools like DASH.4
My only real disappointment was that these clever and insightful authors didn’t elaborate on what, in their opinion, counts as ‘what works’, and what they would consider good measures of ‘works’. They (no doubt correctly) thought it was beyond the scope of this paper. There were clues in the discussion – some of the papers reviewed used measures like learner preference, skills retention and improved behavioral skills.
However, I hope it’s a prompt for the next phase of debriefing conversations and research, including on debrief2learn! I’m interested in what others think the measures of a ‘good debrief’ are. I personally am not sure learner preference is a great one…
In summary – great stuff. A paper that should be widely read by simulation educators.