This article was first published in the May 2016 issue of WIRED magazine.
A serious blood infection can kill even young and fit patients very quickly. With a long list of other sick patients on the doctor's mind, a checklist can be life-saving.
For instance, the Sepsis Six is a set of procedures and treatments that, if carried out within the first hour of sepsis being recognised, can double a patient's chances of survival. The Sepsis Six is used throughout the NHS and is an example of the recent trend in medicine towards checklists as a way of improving patient safety.
Since the aviation industry improved its safety record in the 80s and 90s - in part through the use of checklists by pilots - similar methods have been attempted in healthcare. In 1999, the Institute of Medicine, a non-governmental US body, published a report, To Err is Human, the first major acknowledgement of the scale of death and harm caused by medical error. Since then, the focus across all aspects of medicine has been on imitating aviation's error reporting in an attempt to create the "black box" of healthcare.
In 2001, Peter Pronovost, a specialist in intensive care at Johns Hopkins Hospital in Baltimore in the US, created a five-step checklist for placing a catheter in a patient's large vein. The checklist included simple instructions such as "Wash your hands with soap" and "Wear a sterile mask, hat, gown and gloves". It cut infection rates from 2.7 per 1,000 catheter-days to zero within three months, and is estimated to have saved 1,500 lives across the Michigan hospitals in which it was implemented. Since then, checklists have proliferated in every aspect of healthcare.
However, there's a problem with the overuse of checklists: they oversimplify medicine. Checklists might work for carrying out procedures and remembering simple sequences of actions, but not for managing the complexity of a 95-year-old with heart failure and multiple co-morbidities who wishes to die in peace. Patients' psychology and physiology are complex, constantly changing and still poorly understood; caring for such patients requires systems with flexibility and adaptability. Checklists can't simplify the inherent complexity of medicine.
Don Berwick, a former administrator of the US Centers for Medicare and Medicaid Services, argues in his patient-safety research that healthcare can be split into two systems: ultra-safe medicine, such as anaesthesiology and blood transfusions (which resemble aviation); and complex adaptive medicine, which covers almost everything else, from general medicine to intensive care. In ultra-safe medicine, activities are well defined, behaviour is rule-based and patients are relatively straightforward. Safety is the main priority.
Complex adaptive medicine involves broad areas of expertise. Behaviour is based on physicians' knowledge rather than rules, and patients can vary wildly in their complexity. Here, production - or efficiency - is the main priority, and the aim is to achieve as much safety as possible while maintaining that productivity.
The first system, much like aviation, constrains the role of the physician and uses system design to prevent error. The second gives physicians autonomy and treats them as experts of unlimited capability. In doing so, it places unreasonable demands on the individual: we expect doctors to be free from error despite working in a complex, changing world.
We expect them to admit to errors, but still place responsibility firmly on their shoulders. Complex adaptive medicine doesn't take into account the limits of human cognition or the poorly designed systems that cause medical error - it lays the blame on the individual.
The "black box" thinking approach to healthcare works in ultra-safe medicine. However, because it ignores complex adaptive medicine, it is also creating a culture of fear and blame that has permeated the NHS and the US healthcare system. This has led to a fear of failure, scapegoating of whistleblowers, and physicians covering up their errors. Identifying error without addressing the root cause is, to paraphrase James Reason, a key figure in the aviation-safety revolution of the 80s and 90s, like "fighting mosquitoes, but not draining the swamp".
There are two ways to drain the swamp. The first, Berwick's approach, is to transition the rest of medicine from complex adaptive to ultra-safe, checklist-able medicine. This is how aviation itself became safer, but it's a slow process requiring massive cultural and organisational change, and, given the volatile and complex nature of human beings, it may not be possible at all.
The second approach demands more than checklists: using cognitive and behavioural psychology to design systems that facilitate the right behaviours. For example, designers from the Royal College of Art have been redesigning many aspects of care at St Mary's Hospital in London. Nearly ten per cent of medical errors involve medication, and rethinking the design of drug packaging can reduce error by helping clinicians select the correct drug. Instead of using checklists to catch errors and omissions after the fact, well-designed medical devices and software can prevent them in the first place.
Cosima Gretton is a doctor, digital health consultant and writer for the King's Fund
This article was originally published by WIRED UK