On error management: lessons from aviation
BMJ 2000; 320 doi: https://doi.org/10.1136/bmj.320.7237.781 (Published 18 March 2000) Cite this as: BMJ 2000;320:781
All rapid responses
Dear Sir - The article by Helmreich discussing lessons to be learnt
from aviation [1] is useful, and the principles outlined are mentioned in
several other articles, but they refer mainly to commercial aviation
rather than to general aviation (GA) - private, small business, aerial
photography, medical services, police work, and so on. The fatal accident
rate in this group perhaps better reflects the problems, especially
psychological ones, that affect pilots (often flying single-handed) who are
not protected by the vast machinery of an international flying
organisation, cockpit cross checks, and the like. The three main causes of
death in GA are loss of control (in either instrument or visual
conditions), controlled flight into terrain (flying into a mountain), and
fuel starvation. [2] Many loss of control incidents were due to failure to
recognise lack of ability, to being out of current practice, or to
over-confidence. Controlled flight into terrain incidents occur in
instrument conditions and are usually due to pilots being lost, failing to
obey the rules for terrain clearance, or both. Most engine "failures" due
to running out of fuel defy belief. Yet although each of these groups of
error is likely to prove fatal for the pilot, let alone the passengers,
they still occur. Clearly, the psychological factors involved are complex,
but it is unlikely that any pilot set out on that fateful day with the
intention of dying.
Airlines now carry out rigorous psychological assessment before
appointing a pilot to training and, in light of the GA accident rate, a
section on human factors and performance has been introduced into the
Private Pilot syllabus. It remains to be seen whether this will help to
reduce the human factors involved in GA fatalities, but some will
inevitably still occur. [3] However, psychological assessment of doctors
and/or medical students, along with training in the recognition of
personality types and error-prone situations, could benefit practitioners
and patients alike and help prevent such scenarios as those related in
Helmreich's article.
Ken McCune, F.R.C.S.(Ed)
Research Fellow and private pilot!
References
1. Helmreich RL. On error management: lessons from aviation. BMJ
2000;320:781-5.
2. Civil Aviation Authority. CAP 667: Review of General Aviation Fatal
Accidents. Gatwick Airport, London: CAA, 1998.
3. Beaty D. The Naked Pilot. Shrewsbury: Airlife Publishing, 1995.
Competing interests: No competing interests
Editor - I was interested to read the recent papers in the BMJ by
Helmreich and by Gaba concerning the similarities between anaesthesia and
aviation in terms of the performance standards of personnel. [1,2] Having
attended both anaesthesia crew resource management simulator training and
aviation crew resource management training, in the UK and Australia, I can
confirm that the models are indeed very similar. Furthermore, having taken
the training into the operating theatre and also into the air (as part of
an aeromedical rescue team), I can also testify to the value of such
training in its application to the working environment for which it is
intended.
The recognition that errors occur and the need to move away from a culture
of blame have been highlighted before in anaesthesia. [3] The confidential
critical incident reporting system set up by the Royal College of
Anaesthetists has gone some way towards recognising the need to mirror the
systems used in the aviation industry.
However, it has also been noted that the extensive professional training
undertaken by doctors, together with experience on the job, generally
ensures that errors caused by failures of understanding are rare and that
task overload is not at the root of mistakes. This is achieved by making
some processes relatively automatic and unconscious. As such, most mishaps
are caused by errors in carrying out rather simple tasks that would
normally demand little attention. An implication of this is that the more
experienced operator is more likely to make such errors. [4]
With the advent of re-certification for hospital doctors, the obvious
implications for clinical governance, and the availability of anaesthesia
simulators in Stirling, Bristol and London, surely it is sensible that all
anaesthetic staff regularly undergo this training, as is expected of our
counterparts in the aviation industry.
References
[1] Helmreich RL. On error management: lessons from aviation. BMJ
2000;320:781-5.
[2] Gaba DM. Anaesthesiology as a model for patient safety in health
care. BMJ 2000;320:785-8.
[3] Allnutt MF. Human factors in accidents. Br J Anaesth 1987;59:856-64.
[4] Chappelow J. The psychology of safety. Br J Clin Psychology
1988;2:108-25.
Peter J Shirley
Senior Specialist Registrar
Department of Anaesthesia,
Aberdeen Royal Infirmary,
Foresterhill,
Aberdeen,
AB25 2ZN
Competing interests: No competing interests
Reducing Errors by Preventing Legal Sanctions for Error Reporting
I have read Helmreich's book "Culture at Work in Aviation and
Medicine," have several years' experience in the operating room, and am
researching medication error reduction for my dissertation.
The error problems inherent in the OR include the culture in which
surgeons are trained: namely, to reject comments made by nurses and others
who are attempting to report a problem or, worse yet, to yell at the
reporting person as if to blame them (Helmreich, 1998). This should be
included among both the latent and the existing error conditions.
Leadership should come in the form of collaboration rather than the
typical "captain of the ship" mentality so frequently found in the United
States.
As is implied in Helmreich's book, a team approach should exist in
which all those in the OR work as equals, supporting and respecting each
other's comments. With the impending nursing shortage, more collaboration
and less abuse are needed in the OR (Helmreich, 1998). Nevertheless, other
medical error is related to the legal sanctions imposed by licensing boards
and regulatory agencies.
A major barrier to the reporting of medication administration errors
(MAEs) continues to be the lack of state and national legal protections
for individuals and institutions (ISMP, 2000). During 1999, the United
States Pharmacopeia gathered valuable medication error data using an
anonymous reporting system, yet only 56 of 6000 hospitals participated,
owing to fear of repercussions (USP, 2000). Until laws are in place to
protect individuals and institutions that report errors, errors will
continue to go unreported, decreasing the chance of eliminating the
underlying system failures. Of course, there would need to be exceptions
to these protections, for example in cases of intentional harm or neglect,
drug abuse, or serious harm to the patient. But many of the errors that go
unreported have resulted in little or no harm to patients; nurses do not
report them for fear of retaliation. Even where institutions have
anonymous reporting systems, many nurses do not trust that their jobs will
not be affected by reporting. That is why a nationally based voluntary
reporting mechanism, with encrypted internet access for individuals, is
the best way to obtain error information. In addition, laws should be in
place to protect individuals from discoverability unless there was serious
injury, malicious intent, or drug abuse.
References:
Helmreich, R. L., & Merritt, A. C. (1998). Culture at work in
aviation and medicine: National, organizational and professional
influences. Aldershot, Hants, England: Ashgate Publishing Limited.
Institute for Safe Medication Practices [ISMP] (2000). A discussion
paper on adverse event and error reporting in healthcare. Available on the
World Wide Web: http://www.ismp.org
United States Pharmacopeia [USP] (2000). Summary of the 1999
information submitted to MedMarx: A national database for hospital
medication error reporting. Available on the World Wide Web:
http://www.usp.org/medmarx
Competing interests: No competing interests