Why patient safety is such a tough nut to crack
BMJ 2011; 342 doi: https://doi.org/10.1136/bmj.d3447 (Published 21 June 2011)
Cite this as: BMJ 2011;342:d3447
- Ian P Leistikow, coordinator of patient safety center1,
- Cor J Kalkman, professor in anesthesiology; head of patient safety center1,
- Hans de Bruijn, professor of public administration and management2
- 1Patient Safety Center, University Medical Center Utrecht, Utrecht, Netherlands
- 2Policy, Organisation, Law and Gaming Group, Faculty of Technology, Policy and Management, Delft University of Technology, Delft, Netherlands
- Correspondence to: Ian Leistikow, Dutch Healthcare Inspectorate, PO Box 90137, 5200 MA Den Bosch, Netherlands
- Accepted 16 May 2011
Patient safety has become a major concern in healthcare worldwide. Of patients admitted to hospitals, 3.7% to 17.7% are inadvertently harmed by the way their healthcare is delivered.1 2 Preventable adverse events lead to a larger annual loss of lives than traffic accidents, AIDS, or breast cancer.3 There is, however, a discrepancy between the gravity of the problem and the frailty of the solutions implemented to date.4 5 6 7 On the one hand, attention to patient safety issues is steadily increasing, as is the pressure placed on healthcare professionals, organisations, and regulators to curtail the extent of unintended harm to patients. On the other hand, the pace at which solutions are applied is frustratingly slow, and methodologically sound studies looking at the impact of implemented safety solutions have often found results that were ambiguous at best.8
The four challenges of patient safety
The four key challenges in implementing patient safety interventions, based on the literature, conferences, and the authors’ experiences in setting up a hospital-wide patient safety programme in a university medical centre, are: visibility, ambiguity, complexity, and autonomy.
Safety issues in healthcare are often not clearly visible. There is often no discernible distinction between a patient who has died because of a preventable adverse event, and any other deceased patient.
Further, even if a problem is recognised at the level of the individual patient, its magnitude can remain hidden from the healthcare professionals involved. Lucian Leape called this “the tyranny of small numbers”: if 400 medical specialists work in a hospital and 40 patients die each year as a result of unsafe practices, each doctor will experience an avoidable death about once every 10 years.9 Even if this event has a major emotional impact, it may well be seen as a freak occurrence rather than a daily problem.
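The arithmetic behind Leape's "tyranny of small numbers" can be made explicit. The sketch below simply works through the figures quoted above (400 specialists, 40 avoidable deaths per year); it is an illustration of the visibility problem, not a model of any particular hospital:

```python
# Tyranny of small numbers: a hospital-wide problem looks rare to each doctor.
specialists = 400      # medical specialists working in the hospital (example figure)
avoidable_deaths = 40  # avoidable deaths per year, hospital-wide (example figure)

# Spread evenly, each doctor sees only a small fraction of the problem.
deaths_per_doctor_per_year = avoidable_deaths / specialists   # 0.1 per doctor per year
years_between_events = 1 / deaths_per_doctor_per_year         # ~10 years per doctor

print(f"Each doctor experiences an avoidable death about once every "
      f"{years_between_events:.0f} years")
```

From the hospital's perspective the problem recurs weekly (40 deaths a year is nearly one a week), yet from each individual doctor's perspective it is a once-a-decade event, which is why it can feel like a freak occurrence rather than a daily problem.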
Research showed that healthcare professionals thought the percentage of adverse events in healthcare was substantially lower than the To Err Is Human report claimed, even though many had personal experiences of these events.10 This shows a discordance between the real extent of the problem and the extent perceived by the professionals who could have the largest effect on reducing it.
Low visibility makes it hard to create a sense of urgency. Safety thus often loses the battle for attention against more visible problems such as financial deficits, obtaining new technology, or labour shortages. Low visibility also makes it difficult for executives and professionals to justify the allocation of large resources to improving patient safety, not only to external stakeholders, employees, and colleagues, but even to themselves.
The second challenge is the high level of ambiguity. Evidence for a clear cause and effect relation is often inconclusive. Consider this example: in our hospital a nurse inadvertently administered adrenaline (epinephrine) instead of heparin through an intravenous line to a very sick paediatric patient. The patient went into cardiac arrest and was resuscitated successfully but died three days later. What was the cause of death? The half life of adrenaline is far too short for it to have had any effect three days later, and though theoretically the adrenaline overdose could have caused myocardial damage, it could just as well have been the patient’s prior illness that led to death. A mistake with grave consequences for the patient clearly was made, but even here we cannot be certain that the error in fact caused the patient’s death.
Another example of the ambiguous nature of the patient safety problem can be seen in the discussions about the role of systems versus the role of individual behaviour. For a long time, bad outcomes were typically attributed to the so called bad behaviour of individuals. Research on morbidity and mortality meetings showed that most adverse outcomes were blamed on an individual professional.11 Since the publication of To Err Is Human, more and more healthcare professionals have embraced the concept of system failure as the cause of adverse events. The problem, however, is that both sides are right: almost all adverse events in healthcare can be attributed to both system errors and individual errors, depending on which side of the discussion one stands.
The third challenge is complexity. In such complex and tightly coupled organisations as hospitals, practically everything can have an effect on patient safety: from the temperature of tap water to the logistics of meals, from floor tiles to education and training. Donald Berwick describes the consequence: “A hospital that wanted to make patients truly safer would have to involve almost all departments, support systems, and patterns of activity. Dozens of habitual systems would have to change—rounds, record keeping, meetings, training programs, policy manuals, and review procedures, to name but a few.”12
As a consequence, one can feel overwhelmed by the sheer magnitude of the problem. This can lead to fatalism: that it is useless to make one aspect of care safer, because the problem will only strike somewhere else. It can also lead to heated discussions between professionals and executives about which safety problem is the most urgent. Executives might think compliance with regulations (such as hygiene protocols) is paramount, whereas professionals may want to focus on proper equipment and staffing levels. Both can argue the importance of their issue and both can be right at the same time.
The fourth challenge is professional autonomy. Autonomy is in many ways conducive to patient safety: healthcare professionals are responsible for the safety of their patients and feel that responsibility strongly. However, this positive characteristic has a negative aspect: physicians often see their autonomy as an unconditional requirement for delivering good care. Consider the example of a surgical patient on an intensive care ward. The surgeon can be adamant that a certain therapy should be applied for “his” patient, while the intensivist, who feels just as responsible for the patient, is convinced another therapy is more appropriate. Both specialists think their autonomy gives them the right to supersede the other’s order. It is not unusual in these cases that the care for the patient will oscillate depending on which specialist saw the patient last. Even when autonomy impedes safety, restrictions on autonomy will not be easily accepted.13
Another aspect of autonomy that frustrates safety is the culturally ingrained reluctance to correct an erring colleague. There are many painful recent examples of dysfunctional physicians continuing their practice for years without correction by their peers, who were well aware of the problem. As in other professional groups, such as judges, autonomy has contributed to a so called non-intervention culture.14
This limited role of peer control concerns senior professionals in particular. In an anonymous patient safety questionnaire sent out to all our physicians and nurses, we asked if the respondent would speak to their superior if the superior behaved in a manner that was clearly unsafe for the patient. Of the 135 residents that returned the questionnaire, 40% stated that they would not. In 1991 Albert Wu found that only 54% of residents had told their supervisor about a serious mistake the resident had made.15 Volpp and Grande later described the negative effect of senior faculty on openness about safety: “the presence of many senior faculty members at these [morbidity and mortality] conferences inhibits frank discussion among house staff of the possible role of errors in bad outcome.”16
Healthcare professionals can experience patient safety interventions proposed by someone outside of their domain, like healthcare executives or professionals from another specialty, as a threat to their autonomy.
Patient safety initiatives should respect the four challenges
Many patient safety initiatives focus on the content of the measures a professional should take, for example, deployment of rapid response teams or implementation of a ventilator bundle to prevent ventilator associated pneumonia. Because of ambiguity, however, an intervention that works in one context will not necessarily work elsewhere. The expertise of nurses, the availability of staff, and the relationship between nurses and attending physicians can all influence the added value of a rapid response team. The complexity of patient safety can also lead professionals to experience an initiative as arbitrary: why precisely these elements of the ventilator bundle and not others? This often leads to professionals retreating behind their autonomy: “let us decide what’s best for our patients.” Although seldom published, we see the effects around us every day: non-adherence to protocols, reluctance to use standardised handover formats, resistance to limited working hours, and difficulties engaging physicians in incident reporting.
Ways out of the impasse
The good news is that these problems are not unique. In many other disciplines, professionals are facing the same challenges as described here. The primary strategy is to find a better balance between the content of an intervention and the process in which this content is designed.17 Healthcare professionals should not be forced to adopt predesigned, disputable safety interventions, but should be involved in interactive processes to develop interventions in their own specific context. Once involved, professionals gain influence and become committed to a workable and reasonable outcome. In addition, they acquire knowledge about patient safety that directly or indirectly improves their daily patient care.
These interactive processes might start on a small scale, which is a major advantage, being much easier to initiate than a substantive top down intervention. Once successful, a small scale intervention can have a high impact: peer to peer style dissemination of the insights gained bypasses autonomy issues that so often frustrate top down approaches.
Consider pre-surgical checklists as an example. Despite mounting evidence that they can dramatically decrease complication rates and mortality, merely imposing this intervention is not sufficient.18 19 Rather, professionals should be facilitated to create their own, tailor-made checklists, or given the chance to argue the similar or better effect of an alternative intervention.20
Criteria for success
To be successful, these interactive processes must meet several criteria. Firstly, participants must have influence among their peers, which will be conducive to the dissemination of the results. Secondly, participation must be regarded as safe and attractive. Thirdly, this can be fostered by creating enough room during the process for professionals to define what they see as the main problems and solutions. Fourthly, management must be willing to approve the outcome. Finally, participation at the start can be made mandatory, but professionals must have the freedom to step out of the process if they think it is going the wrong way. This might seem to put the enterprise at risk, but the exit option is often conducive to the willingness to participate at the start of a process. During the process, the exit option will probably not be used: the longer a participant is engaged, the more they have invested and the harder it is to opt out.
The multiple perspectives of professionals on problems and solutions can be discussed in these processes, and can result in improvements that work and are supported by both professionals and management. The professionals’ energy is thus shifted from resisting what may be perceived as external interference to finding ways to tackle the four key challenges and improve patient safety in their own arena.
For example, by having healthcare professionals execute root cause analysis in multidisciplinary teams, safety issues become more visible to them, they are compelled to consider each other’s perspective and they learn to handle the ambiguity and complexity of factors influencing safety, while maintaining their autonomy. The quality of such analyses at the ward level may be lower than that performed by a professional outsider, but it can tremendously engage professionals and empower them to implement multiple sustainable safety interventions.
Engaging professionals in proactive risk analyses
After successfully completing a pilot with proactive risk analysis (analysing the process of administering vincristine on a paediatric ward), the board of the University Medical Center Utrecht (UMC Utrecht) decided that all divisions should conduct at least one proactive risk analysis.
How did the board deal with the potential barriers? Professionals were allowed to select the care processes that they thought might have the highest chance of causing adverse events. Choices included administering blood products, the care process of a patient with a hip fracture from emergency room to operating room, the restraint of agitated patients, and nutrition of patients with metabolic disorders.
Many professionals were not willing to commit up front to the expected time investment of seven 90 minute meetings. They were urged to join the first meeting, but were free to leave any time during the analysis process, which made it easier for them to join. Each team had at least some authoritative professionals with informal power among their fellow professionals. The board would leave the handling of the recommendations that came out of the analyses to the division heads.
During the process, it proved difficult to use the exit option. The professionals were actively analysing care processes that they themselves had selected, and there was some peer pressure from the group to stay. Also, early in the process, the first lessons learnt became visible, which enthused the professionals to continue.
In the period 2005–2006, nine proactive analyses were completed, leading to 96 safety recommendations. Examples include: placing a moving platform on the counter to prevent blood products from clotting, performing a time out procedure before surgical incision, adding a decision tree to the restraining protocol, and digitising the diet forms. We found that 85% of the recommendations had been implemented, thanks to the way the process was organised.
Proactive analysis has now become standard within the UMC Utrecht.
IPL was coordinator of the patient safety programme within the University Medical Center Utrecht (UMC Utrecht), Netherlands, from 2003 to 2011. This is a 1000 bed tertiary care centre with 10 000 employees and an annual budget of over €750m (£659m; $1.1bn). He has recently finished a PhD thesis on leadership and patient safety and is now senior inspector for the Dutch Healthcare Inspectorate. CJK is professor in anaesthesiology and head of the UMC Utrecht Patient Safety Center. HdB is professor at the Faculty of Technology, Policy and Management of the Delft University of Technology, Delft, Netherlands. IPL, CJK, and HdB wrote the article. HdB initiated the vision of the article. IPL is guarantor.
Funding: The organisations that pay our salaries do not mandate open access publication.
Competing interests: All authors have completed the ICMJE uniform disclosure form at www.icmje.org/coi_disclosure.pdf (available on request from the corresponding author) and declare: no support from any organisation for the submitted work; no financial relationships with any organisations that might have an interest in the submitted work in the previous three years; no other relationships or activities that could appear to have influenced the submitted work.
Provenance and peer review: Not commissioned; externally peer reviewed.