According to a recent study, around 350 patients die as a result of preventable errors in Australian hospitals every two weeks. The figure would be expected to be much higher in the UK.
Prof Jeff Richardson, from Monash University, aptly said, “The issue of adverse events in the Australian health system should dominate all others. However, it would be closer to the truth to describe it as Australia’s best kept secret.”
I have a personal interest in this issue. My father died as a result of a “preventable hospital error.” He was having a routine imaging procedure of his liver and bile ducts when a major artery was hit. The bleeding was not recognised until too late and he bled to death. (The autopsy report claimed he died of a heart attack! The heart eventually does stop when there is not enough blood.)
So what is the answer? Current debate is focussed on improving systems: mandatory reporting of incidents, immunity from prosecution for those who report, and so on.
These can all be grouped together as external interventions.
But the system is not really the problem. The system exists to control human behaviour. The core problem is inside, in the people who commit or contribute to the preventable errors.
The problem is that we are dealing with fallible, very fallible human beings. But, more importantly, human beings with significant moral limitations. The problem is primarily an ethical one, about the kind of people we are, our own personal ethical commitments and motivations.
The famous Canadian physician Thomas McRae once said very wisely that in medicine “more is missed by not looking than by not knowing.”
The desire to look, the motivation to care, commitment to perfectionism, or at least doing a good enough job – these are the most important elements in addressing preventable errors.
We need to make doctors and all health care staff better, more moral people. No amount of tweaking the system will address that.
I was asked at dinner last night by one of my students, what the relevance of philosophy and ethics was. Ethics is at the heart of the problem of preventable deaths, the issue of personal ethics. We need to look inside ourselves, not outside, if we are to tackle this problem.
Barbara Sahakian has recently done a study of modafinil improving the performance of sleep-deprived surgeons. This kind of study represents the new wave of enhancing human performance. But more important than physical and cognitive performance is probably moral performance, the will to be good.
Human moral enhancement is a current research focus of the Oxford Centre for Neuroethics. I have written a number of articles on moral enhancement.
Julian, a nice post, but I'm not sure I agree. Although this depends on whether I understand your claim correctly. You say:
"But the system is not really the problem. The system exists to control human behaviour. The core problem is inside, in the people who commit or contribute to the preventable errors."
I guess this depends on what we think the system is and who is internal and external to it. If your claim is that the healthcare workers are on the inside and the architects of the system are on the outside (government, policy-makers, budget-makers, management) then I disagree. On the other hand, if you want to put all of these as being internal to the system then I'll agree.
In any case, the way the system is set up, its structure, its architecture, all impact on the functioning and ability of those working in hospitals and other health care settings. Even the most moral, or indeed morally educated, doctors, nurses, etc, will be tested to the limit and make (avoidable) errors in a system which is time and cash poor and within which people are pressured, stressed, unappreciated, and feel engaged in a constant battle.
Making the working environment more conducive to thinking, compassion, and moral behaviour would go a long way.
With regards to systems errors there is a case (which I think you can find in Alan Merry & Alexander McCall Smith's 'Errors, Medicine, & the Law') where an anaesthetist accidentally injected dopamine instead of doxapram into a convulsing patient. The patient arrested, but was resuscitated and transferred. Afterwards the doctor realised his error and reported it. He was subsequently charged with manslaughter.
What contributed to this doctor's mistake? Was it a lack of morals? It wouldn't seem so, since he admitted his mistake as soon as he realised it – to his detriment. Was it the fact that a long emergency operation had preceded the episode? That the dopamine ampoule looked exactly like the doxapram ampoule? That there was no nurse available to check the ampoule? Something else? In all likelihood some combination of situational and other factors.
This case in NZ led to a campaign which resulted in systems changes, for example, to the packaging of drugs and making sure that all drug trolleys in every anaesthetics room are the same, with drugs in a particular position.
You are quite right when you suggest that 'tweaking' the system is not enough. In many cases nothing short of wholesale overhaul will do!
Medical error is complex and Julian is right to highlight how common it is, together with the human consequences and the unacceptable way that "the system" can close ranks around errors (vide the "cardiac arrest" comment in the report.)
An additional part of the problem, apart from system failure, individual negligence/fallibility and cover-ups, is the way we as a profession set ourselves up for failure by portraying an image of infallibility, most importantly to ourselves, but also to our patients. This creates unrealistic expectations in the community (which come back to bite us all) and also makes it hard to acknowledge our own fallibility to ourselves, which is fatal in dealing with patients and also in applying the rigour we need in analysing events to make them less likely the next time.
As a neonatologist my standard pre-delivery chat to likely NICU parents goes along these lines.
"Sorry, Ms X, but you are about to have a baby of x weeks' gestation. This means that baby has a y% chance of death and a z% chance of severe disability. If your baby dies it will almost certainly be as a result of a withdrawal of care, a decision in which you will have a major part. You need to start working with us and yourselves about your priorities and beliefs should this eventuate. We will continue to explore this with you daily until the situation clarifies, but you need to look to your supports and prepare yourselves for the worst year of your lives. We are available to help with this and are open to anything possible to help—-
etc etc
Then
You are entering a 6 month relationship with a community of 200+ people, with all of the confusions, fallibilities and communication cock-ups inherent in this. The survival skills you need to cultivate are the ability to get your information from 2 or 3 people over time, rather than 20 or 30; we will help you in this by— and we need your help too.
This community of 200 people is made up of fallible people who all try their best and mostly get it right. It is, however, inevitable that mistakes will be made in the care of your baby, mostly small issues of minor error in care or communication, but occasionally major errors. We have systems in place to minimise the impact of such errors and to minimise the risk to your baby. These systems mostly work but they are also human systems and are therefore fallible, so sometimes we do make mistakes which kill or maim babies. We cannot promise that we will not make a mistake which kills or maims your baby; all that we can do is to promise that we will be honest with you if this happens and that we will be doing all we can to minimise this risk."
They commonly look a bit gob-smacked after this.
What then commonly happens is that one of them looks a bit pensive and then pipes up with something like: "Mmm. I did put the wrong series of head gasket on that Ford last month and we had to totally strip the engine again after it blew up."
Even god-like doctors are fallible like everybody else, and most parents can accept this and use it as a far more realistic and usable basis for an ongoing relationship if given the chance.
Infallibility is bad for all sorts of reasons, mostly 'cos we can't deliver it, but even more importantly because it is a very destructive frame of mind which makes progress and honesty very difficult. It is very tempting to bask in the glory of the various things we can do, getting caught up in our own magnificence, but it is completely unrealistic and a very bad basis for planning and action, both increasing the risk of error and making it more likely that we will be judged harshly when the inevitable errors occur (thus producing more defensive behaviours and denial).
There is a need for individual responsibility around errors and clinical decision making, but ritual crucifixion of the 'guilty' is not necessarily the way forward. The modern trend to look at "system failure" has been useful in allowing the freedom to analyse error and reduce recurrence risk, but it also carries the risk that the negligent or incompetent can point to this and say "not my fault".
Perhaps we need systems which can cope with the fact that we are all occasionally negligent or incompetent, but we also need an honest discourse with our patients about this. Hard conversation.
Andrew Watkins