Dirty, deluded, and dangerous

BMJ 2012;345:e8330 doi: http://dx.doi.org/10.1136/bmj.e8330 (Published 18 December 2012)
- Gary L French, honorary professor of microbiology and honorary consultant microbiologist
Obstetricians were outraged when, in 1846, Ignaz Semmelweis reduced mortality from puerperal fever in Viennese women from 16% to 3% by making doctors and medical students disinfect their hands between performing postmortems and delivering babies: they could not accept any criticism of their professional practice. Semmelweis lost his job and died in a lunatic asylum, while his dirty, deluded, and dangerous colleagues abandoned his policies, continued with their distinguished careers, and returned puerperal mortality to its previous appalling level.
Of course they did not know then, as we do now, that puerperal fever is caused by group A streptococcus, or that normal human skin is colonised by high concentrations of bacteria that transfer to the hands of staff during routine patient care and then on to other patients.1 2 They would have been shocked to discover that we now have incontrovertible evidence that hand decontamination significantly reduces the transfer of pathogens and the incidence of hospital and healthcare associated infections,2 and that Semmelweis has been vindicated.
Between the 1890s and the 1950s, the epidemiology of common bacterial pathogens was elucidated. This led to the universal introduction of standard hygiene measures, such as handwashing, no touch technique, gloving and gowning, instrument sterilisation, environmental cleaning, air filtration, the separation of beds, and the isolation of infected patients. Doctors and nurses in the 1950s were still afraid of infections: they washed their hands, made sure their hospitals were clean, and kept strictly to good hygiene practice. They were rewarded by low rates of hospital infection and a certainty that cleanliness was indeed next to godliness.
The introduction of penicillin in the 1940s and the explosion of antibiotic discovery in the 1960s had a further dramatic impact on the control of infections, allowing astonishing developments in intensive care medicine, transplantation, and surgery that earlier generations could never have imagined.
All drugs have side effects, and antibiotics came with a terrible one that doctors were too dazzled even to recognise: they made them lose their fear of infection. Infections could be cured with a squirt of antibiotic, or two squirts, or even two antibiotics. And if that didn’t work, there was a whole shelf full of new agents that would. Infections had been vanquished; fever hospitals were closed and isolation rooms were reallocated. Doctors and nurses stopped washing their hands and did not protest, or even notice, when managers stopped cleaning wards. The more relaxed social attitudes of the 1960s were also at odds with the need for strictness in hygiene practice. In a startling return to the 1840s, doctors began to resent being told to be clean, and as late as 1999 doctors were sending letters to the BMJ disputing the effectiveness of handwashing between routine patient contacts.3 4
On average, doctors decontaminate their hands appropriately only 30% of the time,1 although they think they are much better than this. In one study, doctors thought they washed their hands between patients 73% of the time, although they actually did this only 9% of the time.5 Pritchard and Raper were astonished that “doctors can be so extraordinarily self delusional about their behaviour.”6 Doctors seem equally blind to environmental cleanliness and the need to isolate infected patients. The Healthcare Commission report on the tragic outbreaks of Clostridium difficile infection at Maidstone in 2005-6 includes truly shocking photographs of filthy wards and dirty beds that were so close to one another that they were almost touching.7 In case anyone might think this was a one-off, similar failings of infection control and hygiene practice led to a similar dreadful outbreak at Stoke Mandeville in 2005-6.8
With such practices, antibiotic resistant bacteria flourish and hospital infections soar. By 2003, hospitals in England were reporting more than 7000 meticillin resistant Staphylococcus aureus (MRSA) bacteraemias a year.9 Although not all resulted from poor hygiene practice, many of them did. Around 70 000 serious MRSA infections, 700 000 colonisations, and perhaps seven million failures of infection control must have occurred that year. In 2007, hospitals reported more than 55 000 cases of C difficile infection,10 most of which probably resulted from poor infection control and imprudent antibiotic prescribing.
In the end, it was the lay public, not doctors, who put pressure on politicians to call a halt to dirty hospitals and uncontrolled cross infection. Hospitals were required to publish their infection rates, practice audits, and cleanliness ratings, and to continually reduce their infection rates or face the threat of sackings and fines. For the first time, the 2006 Health Act required healthcare institutions to have appropriate infection prevention and control in place, compliant with a code of practice.
Where decades of education and exhortation had failed, legal strictures had a dramatic impact, even on sceptical doctors, just as they had done on sceptical smokers and drivers who refused to wear their seat belts. Doctors and nurses were effectively forced to behave, and by 2011 MRSA bacteraemias in English hospitals had fallen by around 86% (from 7700 in 2003-04 to 1114 in 2011-12) and C difficile infections by 68% (from 55 498 in 2007-08 to 18 005 in 2011-12),9 10 with associated reductions in mortality.11 12 This is one of the most dramatic demonstrations of the effectiveness of good infection control practice (or just good clinical practice) in the medical literature, and it seems to have produced a genuine change in culture. Just as drivers now always use their seat belts and smokers never light up indoors, many doctors now decontaminate their hands between patients without thinking and chastise their colleagues who forget.
However, there are still dirty wards, patients who should be isolated, imprudent antibiotic prescribing, unwashed hands, and many avoidable infections. Some doctors remain sceptical and, like Semmelweis’s colleagues all those years ago, still refuse to accept that they may themselves be part of the problem. Christmas is coming with its judgment of the naughty and nice: time to believe and be good.
Competing interests: The author has completed the ICMJE uniform disclosure form at www.icmje.org/coi_disclosure.pdf (available on request from the corresponding author) and declares: no support from any organisation for the submitted work; no financial relationships with any organisations that might have an interest in the submitted work in the previous three years; no other relationships or activities that could appear to have influenced the submitted work.
Provenance and peer review: Commissioned; not externally peer reviewed.