
Black box thinking: changing the surgical mindset

Alisdair Felstead

Alisdair Felstead is currently a Senior Trauma Fellow at The Royal Sussex County Hospital in Brighton, having completed a Foot and Ankle Fellowship at Queen Alexandra Hospital, Portsmouth. He completed his specialist training in the KSS region, and will soon be taking up a consultant position in Foot and Ankle Surgery.

The concept of learning from one’s mistakes is not a new one. Many of us will be familiar with the now ubiquitous comparison between the airline industry’s approach to safety and that of healthcare organisations. The mandated ‘cockpit silence’ for the WHO checklist shows how far healthcare has come in reducing medical errors by instituting protocols.

Matthew Syed expands upon this idea and introduces the theory of ‘Black Box Thinking’ in his excellent book1. The premise is that each of us can employ a metaphorical ‘black box’, which can be opened after an error and used to influence future practice. The key to opening the box is suppressing natural human traits such as confirmation bias and cognitive dissonance, and eradicating the blame culture that so pervades the modern-day NHS.

The book begins with the account of the death of Elaine Bromiley during a routine sinus operation. This young mother was a low-risk elective patient who suffered hypoxia under general anaesthetic despite the intervention of several senior anaesthetists. The futility of those interventions, and the loss of perspective that prevented the initiation of a surgical airway, provide the opportunity for significant learning. The fact that her widower is an airline pilot has further amplified the ability of her story to change the way individuals work in healthcare. Syed argues that a ‘no blame’ culture is essential if we are to benefit from the learning that these incidents provide. This is not to say that negligence should be ignored, rather that professionals acting in good faith and under pressure should be supported to engage with the learning process, not castigated or ostracised.

Unfortunately, Syed argues, we have a long way to go in modern healthcare to achieve the safety profile of the airline industry. This problem is not restricted to the NHS; indeed, up to 400,000 deaths per year in the USA are attributed to preventable harm. The issue is the divergent approach to failure when comparing the two sectors. After an air accident, the black box is recovered and searched for clues as to the causative factors. Key players are interviewed without prejudice. Sometimes simulations are run to examine the effect of modifying individual factors. Only once the learning has been put into place will flying resume. In healthcare, by contrast, we are crippled by fear of litigation from the patient and shame in front of our peers. Despite official protestations, the impression is given that there must always be somebody ‘at fault’, a scapegoat if you like. It has always struck me as odd and disturbing that some staff in the NHS see the obligatory Datix form as an instrument of blame. This completely misses the point of the exercise and obliterates the opportunity for learning.

Syed illustrates this concept neatly by referencing a Harvard study comparing two hospitals: one where a ‘blame culture’ was endemic and staff lived in fear of reprisal, and one where the environment was open and honest. The reported incidence of mistakes in the former was lower, but the actual measured level of patient harm was higher. The same effect was seen in the ‘Baby P’ case in Haringey, where castigation of the social workers led to resignations, staff shortages and a 25% increase in the rate of child homicide. The message is that, paradoxically, an open and honest ‘no blame’ culture leads to a safer environment for patients.

As well as the culture of the organisation, Syed explores the human traits which tend to lead to ‘closed-loop thinking’. I found his observations enlightening, and it is easy to extrapolate the theories to one’s own professional and personal life. He spends time discussing cognitive dissonance, which in simple terms means the discord created when the values of the subject clash with the reality of the situation. The human trait is to try to explain away factual data that do not seem to fit with one’s own beliefs. The example used in the book is that of the Iraq war, and the presence or otherwise of weapons of mass destruction. He is at pains to point out that he is not expressing a political opinion, simply stating the facts as they were presented. Tony Blair’s justification for the invasion of Iraq hinged upon there being weapons present, hence the risk to his legacy and, by implication, his moral standing, if this theory were disproved. Syed states that it is natural that we try to protect the narrative at all costs, often subconsciously, as not to do so incurs significant risk.

I believe that we see this within orthopaedic surgery. A poor surgical decision will be defended, because the risk to the surgeon’s moral standing, professional pride and self-belief is too great to allow it to be proven wrong. How many times have we blamed the patient, colleagues or the standard of kit for a less than satisfactory result? Looking inward, reflecting, learning and changing practice is a much healthier way to proceed. Unfortunately, practising defensively is further entrenched by the way we operate. Too many ‘metalwork meetings’ end up with aggressive personal criticism rather than critical analysis. We are judged on our X-rays, with no thought as to the integrity of the soft tissues or other crucial variables. I once heard it said that the most useful journal that will never be published is ‘The Journal of Failure’. Instead, we are often reduced to reading single-surgeon, single-centre data on a small number of patients who all did very well from a particular procedure. The learning from this is minimal.

This notion of cognitive dissonance is referenced in the excellent article by Deepa Bose in the September 2021 issue of the JTO2. She states that the natural reaction to mistakes is to ‘bury one’s head in the sand’, but that the healthier way to proceed involves discussion, deconstruction and rationalisation. This process should be automatic after any adverse result in surgery, acknowledging that complications do occur and, crucially, learning from them.

The book briefly touches on confirmation bias, as this is inextricably linked to our approach to failure. Syed states that it is a human trait to want to confirm what we believe to be true, but that a much more effective tactic is to examine the alternative truth, or null hypothesis. In orthopaedics we are often so tempted by the potential of success that we design studies specifically to look for it, and we selectively analyse our outcomes, often explaining away outliers. Unfortunately, this blunts our ability to recognise when things are going wrong, potentially delaying the modification of practice and harming patients. There is no doubt that the increasing use of PROMs data, level 1 research and joint working helps moderate the effect of confirmation bias, yet in orthopaedics we are still hamstrung by surgical dogma. Why analyse the unfixed or poorly fixed trimalleolar ankle fractures that did badly, when we can quote the ones that had an ‘acceptable result’?

One of the key themes is the ‘paradox of success’. The application of the feedback loop is something that we can use to drive improvement in our practice. The most effective learning occurs when the feedback is instant and can be applied in real time. Syed uses the example of learning to steer a car versus steering a ship. If we introduce delay, or worse still break the loop so that we gain no feedback, then improvement and refinement cannot happen. Numerous examples are quoted from both industry (the Unilever nozzle designed by biologists) and sport (David Beckham taking a free kick). This contrasts markedly with the approach often taken in medicine, which does not utilise evolution but employs a linear model of research: theory – design – application.

Black Box Thinking is peppered with examples of individuals who were motivated and inspired by ‘failure’. Trevor Baylis was driven by the lack of batteries in Africa to design his wind-up radio, and James Dyson by the loss of suction of traditional vacuum cleaners. I believe that we can use this mindset to drive improvement in surgical practice. If the theatre list starts late, there may be small incremental improvements (marginal gains, if you like) that we can implement to change the status quo. Sometimes I think we are guilty of trying to redesign services from the ground up, rather than taking an evolutionary approach. On a personal level, we should all be malleable, constantly refining and improving our practice rather than chopping and changing from one implant or technique to another. A supervisor of mine once told me that the most interesting thing is not what a surgeon does, but why he does it, as this reveals an open and intelligent surgical mindset rather than one driven by surgical dogma. In the words of George Bernard Shaw: “A life spent in making mistakes is not only more honourable but more useful than a life spent doing nothing.”

References

1. Syed M. Black Box Thinking: The Surprising Truth About Success (and Why Some People Never Learn from Their Mistakes). London: John Murray; 2015.

2. Bose D. The orthopaedic ostrich: surgeons’ responses to complications. Journal of Trauma & Orthopaedics 2021;9(3):22-24.