The annual metronome of British public service broadcasting, the Reith Lectures, has begun this year’s tick, with Dr Atul Gawande doing the tocking. A smooth tocker with hints of George W Bush-style pulsing, rising enunciation, today’s lecture – the first of four, we should note – asked ‘Why do Doctors Fail?’. Master of the personal anecdote, Gawande told the tale of a certain Baby Walker – not the contraption, but his son – who survived despite being born with an aortic abnormality, while the baby in the next cot with the same condition did not. The answer, by and large, was systems failure: Baby Walker ended up in the right place at the right time, while Baby Maine next door did not: the right place, perhaps, but too late. Walker walked, Maine died.
Being in the right place at the right time also means access to the right people. Hospital admission tests suggested Walker had good oxygen levels and a primary diagnosis of pneumonia. It took an astute paediatrician to spot the oxygen sensor was on the wrong hand – the right monitor, but in the wrong place – and get to the correct diagnosis, and so treatment. On this part of the anecdote, Gawande developed his wider theme: failures caused by ignorance (what we don’t know) and failures caused by ineptitude (not applying what we do know). Jogging alongside, not entirely comfortably, was the ice cube theme: ice cubes are predictable – put any one in the fire, and it melts – whereas hurricanes (and humans) individually are not. Even when we know the general pattern of behaviour, our predictions go awry because each hurricane (human) is unique, a uniqueness aided and indeed abetted by the large number of complex organ systems present in each and every one of us.
Dr No suggests ‘not entirely comfortably’ because, while Dr No may also worship at the altar of individual uniqueness, much of medicine, especially modern evidence based medicine, relies on the fact that we are not unique. It relies on the fact that most people are like most other people. If the drug works for people in the clinical trial, then it will work for most similar people outside the trial (the caveat is necessary because, in the real world, whatever the general rule, there are always exceptions). Like the ice cubes that melt, most people are predictable, a premise so fundamental to evidence based medicine that without it evidence based medicine becomes a meaningless fantasy.
So why do doctors fail? Gawande’s short answer is that we have no black boxes. Because we have no black boxes, no consultation recorders that can be analysed after an adverse event, we don’t know what went wrong and caused the doctor – if it was the doctor, for the question is somewhat prejudicial – to fail. Even today, medicine is practised in a world of opacity, the sanctity of the consultation being the veil that bars all insight, all light from outside, from shining into the sacred space. There are exceptions – JT posted recently on GPs watching other GPs’ consultations, and ward rounds and operating theatres are hardly private places – but by and large, most medicine is practised in private.
There are good reasons for this. As well as lofty notions like the right to privacy, there are practical matters. The sanctity, and the confidentiality and opacity that go with it, are there to allow patients to talk as best they can, without fear, about what ails them, and to give an honest account of their history; both are crucial to reaching the right diagnosis, and so treatment. For doctors, with transparency comes the growing threat of know-it-all managers, guideline fetishists, ambulance-chasing lawyers and, perhaps most alarming of all to individual doctors, the introduction of piranhas if not into the goldfish bowl itself, then alongside it, in the shape of GMC goons given line-of-sight fire through the consulting room window. With transparency comes the blowback of defensive medicine: poor or even bad medicine done not because it is the right thing to do, but to appease the watching controllers.
Not all Hawthorne effects – that people, including doctors, change their behaviour when observed – are bad, but equally there is nothing that guarantees a Hawthorne effect will always be positive. Some of Gawande’s best known work has been on bringing pre-flight-style checklists into medicine. With ‘we have no black boxes’, he introduces another aviation staple. Both ideas, especially checklists, have merit. But are consulting room black boxes the answer in medicine?
As it happens, aviators have for some decades been wont to peer from time to time through the medical cockpit window, and by and large their reaction has been WTF – why can’t these guys get even the safety basics right? Parallels are drawn between the catastrophic costs of failure in both medicine and flying, and contrasts exposed between their respective achievements in improving safety. Others – notably Don Berwick last year in his report on patient safety in the NHS – have borrowed aviation’s no-blame culture concept, and suggested it holds the answer to reducing the occasions when both doctors and the wider NHS fail.
But we must not forget that medicine is not an airliner, nor the consulting room a flight deck. Yes, we have no black boxes, but do we need them? Tinkling away in the back of Dr No’s mind is that familiar refrain ‘Yes! We have no bananas’. Further back, in the internal kaleidoscope of absurdity that is the kernel of his brain, the missing black boxes have turned into missing bananas. Could it be that, if we give the doctors their black boxes, the kaleidoscope of human nature will turn those black boxes into bananas, and so the doctors into monkeys?
Meanwhile, the non-absurd Dr No awaits Dr Gawande’s coming Reith lectures with great interest.
This reminds me of a chapter I read in the book ‘Blink’ by Malcolm Gladwell. It was about a hospital in America that was constantly over capacity because it dealt with people who had no health insurance. One of their worries was that, where someone had a suspected heart attack, they might not spend enough time to make a proper diagnosis; on the other hand, if they spent too much time with them, someone else might cop it. The solution was a heart attack decision tree which had the basic serious signs on it. If the patient corresponded with a certain number of those signs, they were kept in. If they didn’t, they were discharged. They found that it worked very well, and they were able to sort the bad cases from the not so bad ones very quickly.
I think the most that training can do is to ensure that the common, sensible things are done. Some doctors have a diagnostic skill over and above that, and the GP system, where a doctor knows a patient over a number of years, can also help. I always knew when my dad wasn’t well, because when he was well, he kept up a low level of grumbling. When he wasn’t well, he would go quiet because he didn’t want me to know. I never told him how I always knew he was ill, and he never twigged that it was the silence that alerted me.
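The sign-counting triage rule described in the Blink anecdote above can be sketched in a few lines. The signs and the admission threshold here are illustrative assumptions only – the comment does not give the hospital’s actual clinical criteria:

```python
# Hypothetical sketch of a sign-counting triage rule, loosely modelled on
# the chest-pain decision tree described in the comment above. The signs
# and the threshold are illustrative, not real clinical criteria.

SERIOUS_SIGNS = [
    "ecg_changes",         # acute changes on the ECG
    "unstable_angina",     # chest pain at rest or worsening
    "low_blood_pressure",  # systolic pressure below a set cut-off
    "fluid_in_lungs",      # crackles heard on examination
]

def admit_patient(findings):
    """Keep the patient in if enough serious signs are present."""
    count = sum(1 for sign in SERIOUS_SIGNS if findings.get(sign, False))
    return count >= 2  # hypothetical threshold

print(admit_patient({"ecg_changes": True, "fluid_in_lungs": True}))  # True
print(admit_patient({"unstable_angina": True}))                      # False
```

The point of such a rule is exactly the one the comment makes: it trades individual judgement for speed and consistency, sorting the bad cases from the not so bad ones quickly.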
One of the few things Dr No got out of his time at UMDS (Guy’s and St Thomas’) was an introduction to the concept of wicked problems, and the notion that many medical and public health problems, including patient safety, share at least some of the characteristics of wicked problems. Dreamt up and developed by academics with names so extraordinary they must be real – C West Churchman and Horst Rittel to name but two – wicked problems are almost a wicked problem in and of themselves, in that the concept is hard to pin down (itself a characteristic of wicked problems). The literature lays down at least six, sometimes more, defining characteristics, and the essence usually includes notions such as fluidity, uniqueness, shifting boundaries and a worrying tendency towards blowback – somehow the solution either exposes more problems, or even creates new ones. Were one of a sunny disposition – one suspects wicked problem academics are not – one might call wicked problems mischievous problems, in that somehow they are out to get you. The opposite of a wicked problem is sometimes called a tame problem, a common example being simple geometry. If you know the diameter of a circle, you can work out its circumference.
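The circle example shows what makes a tame problem tame: it can be solved once, mechanically, and the solution never bites back. A trivial sketch:

```python
import math

def circumference(diameter):
    """A tame problem: the answer follows mechanically from the input."""
    return math.pi * diameter

# Every circle of diameter 10 has the same circumference, every time.
print(round(circumference(10), 4))  # 31.4159
```

No wicked problem admits a three-line solution of this kind, which is rather the point.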
Much, though of course not all, of flying is based on physics. Without the laws of aerodynamics, planes simply would not fly. Violate those laws – stall the airflow over the wing, for example – and the plane drops out of the sky. Whatever happens, the black box records all, and whatever went wrong can be worked out. Though often complex, the air accident investigation is at its heart more of a tame than a wicked problem.
Medicine on the other hand tends to present wicked problems. The news is scandalously under-filled – perhaps precisely because it is such a huge wicked problem, and so difficult for lowest common denominator journalism to encapsulate and report – with the Ebola outbreak, a real wicked problem of catastrophic proportions. No number of black boxes – not a few, not even a million – is going to ‘fix’ Ebola.
The question in Dr No’s mind is this: if Ebola is the wicked problem on a grand scale, could ‘everyday’ consultations that go wrong in medicine be wicked problems on an individual scale? And if they are, will a black box in every consulting room – the technology of transparency – help solve problems, or, in a shift typical of wicked problems, actually make matters worse?