Head To Head

Have we gone too far in translating ideas from aviation to patient safety? Yes

BMJ 2011;342 (Published 14 January 2011) Cite this as: BMJ 2011;342:c7309
James Rogers, consultant anaesthetist and flying instructor
Department of Anaesthesia, Frenchay Hospital, Bristol BS16 1LE, UK

James Rogers thinks that attempts to learn from aviation are ignoring fundamental factors in healthcare, but David Gaba (doi:10.1136/bmj.c7310) argues that much more could be done

Why are doctors constantly told to adopt aviation safety practices? My own specialty of anaesthesia is particularly vulnerable, based on the dubious analogy that giving an anaesthetic is similar to flying an aircraft. Although initiatives such as the World Health Organization’s surgical safety checklist are generally welcome, the aviation model has only a limited place in medicine because there are fundamental differences between the ways in which doctors and pilots work.

Using a checklist should never detract from the priorities of flying an aircraft or looking after a patient safely. Immediate actions should be committed to memory, followed by reference to a concise aide memoire. Crucially, a checklist is distinct from a briefing, which is normally given at two specific times during a flight—before departure and before descent. A briefing deals with all the “what ifs?” (where to divert to in bad weather, what to do if an engine fails on take-off) and deliberately takes place at a time in flight when workload is relatively low. In the operating theatre the checklist and briefing have merged untidily—team introductions, discussions, and concerns are integral to a briefing but shouldn’t feature on a checklist.

A proper checklist prompts a “challenge-response” dialogue that is conducted in a rapid, efficient way. This works well on the flight deck with only two people involved, both of whom are suitably qualified and alternating between flying and non-flying pilot roles. Standard operating procedures have defined these roles clearly, allowing a captain and first officer who have not previously met to fly a route, both confident of each other’s actions and responsibilities. Such standardisation is an unrealistic aspiration in the operating theatre, where there are more people, a varied skill mix, and a constant to-ing and fro-ing. The person reading the checklist often responds to his or her own challenge, losing the cross checking of items with another team member and negating the value of the exercise. There are also more likely to be “authority gradients,” discouraging junior team members from questioning their seniors. Crew resource management programmes have enabled airlines to reduce such behaviour on the flight deck, but overbearing personalities are still widespread in medicine.

Value of experience

Emergency drills and checklists don’t take experience into account. Pilots train for events they may never encounter, but doctors deal with emergencies frequently and develop judgment. For example, only a few patients with postoperative airway obstruction require re-intubation, but you need to have seen a fair number to decide which they are. In addition, pilots aren’t often faced with having to diagnose—the computerised monitoring systems will display not only exactly what is wrong but also the relevant actions to take. Even with this degree of automation, human confirmation of a problem at the initial stage is useful—as long as it’s correct. In the Kegworth disaster, the crew declared an engine failure on the right rather than the left and went on to mistakenly shut down the good engine.

Are checklists and emergency drills infallible in aviation? Not necessarily—but that’s usually apparent only with the wisdom of hindsight. For example, in the Concorde disaster at Paris, should the pilot have made a snap decision to abort the take-off even after having passed “V1,” the “must go” airspeed, contrary to established procedures? Would the outcome have been better if his burning aircraft had overshot the runway but come to a halt, rather than taking to the air?

Risk management

The expectation in aviation is that everything should go smoothly; equipment is standardised and pilots fly only aircraft on which they have been trained. Even variables such as weather are dealt with—there are strict minimum conditions that must be met before starting an approach to land. In contrast, ill patients come in all shapes and sizes, and diseases follow different courses, even before allowing for the fickleness of human behaviour. Unsurprisingly, the mindset of pilots and doctors is different—in medicine not only do we tackle situations that are inherently dangerous, such as operating on the moribund patient, but we are also obliged to outline the risk and obtain consent to proceed. This doesn’t happen in commercial aviation; if something—equipment, weather, runway condition—doesn’t meet standards, you simply don’t fly (and you don’t consult the passengers in reaching that decision).

The evolution of simulators in aviation has been financially driven. Yes, it helps that emergency situations can be reproduced and drills practised, but using a simulator instead of an empty aircraft for conversion training represents a massive saving. Pilots are able to fly a new aircraft type for the first time on a routine passenger flight (albeit alongside an experienced training captain and extra pilot), such is the quality of their preceding simulator experience. Simulators in medicine don’t offer the same degree of realism, or such obvious value for money.

Safety culture

What would transfer well to medicine? Firstly, the established Confidential Human Factors Incident Reporting Programme (CHIRP), for self reporting near misses and human errors—for example, being distracted and not following an important air traffic clearance. The National Patient Safety Agency operates a similar scheme but, unlike CHIRP, does not publish the original firsthand accounts of incidents. Secondly, a recommended procedure in an emergency situation is given by the mnemonic “DODAR”—diagnosis, options, decision, assign tasks, and review. This offers a structured framework for decision making and using resources to best effect. In particular, “review” encourages situational awareness—do my original assessment and actions still fit with the overall picture?

Doctors need to understand why certain practices work well in aviation but not necessarily in medicine. We should not be seduced by the polished image of flying or introduce unsuitable systems into a different environment. After all, pilots enjoy their job without feeling the need to mimic doctors.




  • I thank Tim Tuckey, an Airbus A320 captain.

  • Competing interests: The author has completed the unified competing interest form at (available on request from him) and declares no support from any organisation for the submitted work; no financial relationships with any organisation that might have an interest in the submitted work in the previous three years; and no other relationships or activities that could appear to have influenced the submitted work.

  • Provenance and peer review: Commissioned; not externally peer reviewed.