Rapid response to:

Practice Pointer

Five strategies for clinicians to advance diagnostic excellence

BMJ 2022; 376 doi: https://doi.org/10.1136/bmj-2021-068044 (Published 16 February 2022) Cite this as: BMJ 2022;376:e068044

Rapid Response:

The Second Elephant

Dear Editor

With the emergence of the Patient Safety movement at the turn of this century, one might be forgiven for believing that diagnostic failure was comparatively uncommon and other threats to patient safety more deserving of attention. Diagnostic failure was mentioned twice in the landmark Institute of Medicine report To Err Is Human,[1] whereas medication errors were mentioned 70 times [2] - not entirely invisible but clearly an elephant in the room. While the experience of many physicians revealed a significant problem in clinical practice, for a variety of reasons it did not see the light of day.[3]

With the formation of the Society to Improve Diagnosis in Medicine (SIDM) in 2011, the elephant has thankfully come into view, and many of its parts are now clearly visible. But there remains a problem with identification. To use a second elephant metaphor, in the ancient Hindu parable of six blind men attempting to determine the nature of an elephant by touch, each interprets it differently on the basis of their training and experience. The moral of the parable is that we judge things by our own experience and orientation, often ignoring the assistance and interpretations of others.[4]

The response from mainstream medicine towards diagnostic failure is that, if we are experiencing difficulty, we should try harder and acquire more medical knowledge. However, while knowledge is indispensable, it will not realise its full potential if we are limited in how we use it. Critically, others who might help to identify the important parts of the elephant include cognitive scientists, who might throw some light on where the failing occurs. Chess players, for example, have no difficulty recognising the pieces (the tools), nor in remembering the role of each piece (the rules); the challenge lies in thinking through the possibilities: generating hypotheses and their consequences, testing them for the optimal move, and avoiding catastrophic loss through making a wrong move. If a patient presents to an emergency physician with subtle or atypical signs of a dissecting aneurysm, a stroke, or a spinal abscess, the thinking part of the diagnosis will not take place unless there is a deeper insight into the diagnostic process, and an awareness that more information may need to be unpacked when there is uncertainty. Applying a cognitive forcing function such as rule out worst-case scenario (ROWS) would at least get these diagnoses onto the differential.

A major challenge in the diagnostic process is the cognitive failure of the clinician.[5,6] The human brain predictably fails in many aspects of cognition: in the interpretation of our perceptions (what we see, feel, and hear), in our memory and recall, in our vulnerability to numerous logical fallacies, and in a variety of cognitive and affective biases. These have all been addressed in a voluminous cognitive science literature over the last century, especially in the last 40 years. Singh et al propose five strategies to advance diagnostic excellence, and list the consideration of bias as one of them.[7] Though cognitive bias is acknowledged, most of their focus is on biases associated with ‘race, ethnicity, gender, and other identities’. While these are of paramount importance and need to be addressed, such biases are social and/or sociological in nature and not the typical biases that confound most diagnostic judgment and decision making (JDM). JDM biases in clinical decision making are typically anchoring, search satisficing, unpacking failure, confirmation bias, availability, and many others,[6] all exquisitely individual in nature, and universal. While both groups of biases describe replicable patterns of inaccurate judgment, illogical interpretation, and irrationality, and while there may be overlap between the two groups, they are categorically different. Keeping them conceptually separate will probably help in terms of their mitigation.

Cognitive science tells us a lot about how we make decisions; it constitutes a specialised domain of expert knowledge on the subject. Yet we appear to be a long way from acknowledging its role in clinical decision making, especially that which bears on the very complicated and complex diagnostic process. Add to this the disinclination of medical educators to include formal training in clinical decision making in the majority of medical undergraduate curricula, and we can begin to see why many diagnoses fail. A US survey of members of the Clerkship Directors in Internal Medicine found that more than half of institutions lacked sessions dedicated to key clinical reasoning concepts, even though the majority view was that a structured curriculum in clinical reasoning should be taught in all phases of medical education, and that more curricular time should be made available for the topic. Not surprisingly, lack of curricular time and of faculty with the expertise to teach these concepts appeared to be the main barriers to their inclusion in the curriculum.[8]

Why explicit training in clinical decision making does not make it into the medical curriculum is puzzling. In the past, medicine has recognised and corrected some of its other shortcomings. For example, in the 1970s the statistical methods used in medical studies came under increasingly serious challenge: inappropriate methods were being used and wrong conclusions drawn from the data. In one report, it was estimated that more than 50% of medical studies were statistically flawed. Not only were statisticians concerned, but physicians, too, were increasingly worried about having to use statistical techniques they did not fully understand.[9,10] This led to a consensus among regulatory authorities that statistics should be an integral part of medical education, and in the UK the General Medical Council recommended in 1980 that statistics be included in the training of doctors. Thereafter, all UK medical schools included training in statistics within the medical education syllabus.[10] Cognitive science is no less a challenge than statistics, and we do appear to need outside help in clinical decision making. It is, arguably, the most important of clinical skills, and there is emerging evidence that it can be successfully taught.[11,12,13] Calls have been made for explicit cognitive science input into medical education.[14,15,16,17]

References
1. Kohn LT, Corrigan JM, Donaldson MS, eds. To Err Is Human: Building a Safer Health System. Report of the Institute of Medicine of the National Academies. Washington, DC: National Academy Press; 1999.
2. Wachter RM. Why diagnostic errors don’t get any respect – and what to do about them. Health Affairs. 2010; 29: 1605-10.
3. Croskerry P. Perspectives on diagnostic failure and patient safety. Healthcare Quarterly. 2012; 15: 50-56.
4. Blind men and an elephant. Wikipedia. https://en.wikipedia.org/wiki/Blind_men_and_an_elephant (accessed 25 February 2022).
5. Graber ML, Franklin N, Gordon R. Diagnostic error in internal medicine. Arch Intern Med. 2005; 165:1493-9. doi: 10.1001/archinte.165.13.1493 pmid: 16009864
6. Croskerry P. The Cognitive Autopsy: A Root Cause Analysis of Medical Decision Making. 2020; New York, New York: Oxford University Press.
7. Singh H, Connor DM, Dhaliwal G. Five strategies for clinicians to advance diagnostic excellence. BMJ 2022; 376: e068044 http://dx.doi.org/10.1136/bmj-2021-068044.
8. Rencic J, Trowbridge RL, Fagan M, Szauter K, Durning S. Clinical reasoning education at US medical schools: results from a national survey of internal medicine clerkship directors. J Gen Intern Med. 2017; 32: 1242-6. doi: 10.1007/s11606-017-4159-y
9. Altman DG, Bland JM. Improving doctors’ understanding of statistics. J R Stat Soc Ser A 1991; 154:223-67.
10. Appleton DR. What statistics should we teach medical undergraduates and graduates? Stat Med. 1990; 9: 1013-1021.
11. Ruedinger E, Mathews B, Olson APJ. Decision–diagnosis: An introduction to diagnostic error and medical decision-making. MedEdPORTAL. 2016; 12:10378. https://doi.org/10.15766/mep_2374-8265.10378.
12. Hunzeker A, Amin R. Teaching cognitive bias in a hurry: Single-session workshop approach for psychiatry residents and students. MedEdPORTAL. 2016; 12:10451. https://doi.org/10.15766/mep_2374-8265.10451.
13. Bonifacino E, Follansbee WP, Farkas AH, Jeong K, McNeil MA, DiNardo DJ. Implementation of a clinical reasoning curriculum for clerkship-level medical students: a pseudo-randomized and controlled study. Diagnosis 2019; 6: 165–172. https://doi.org/10.1515/dx-2018-0063
14. Elstein AS. Thinking about diagnostic thinking: a 30-year perspective. Adv Health Sci Educ Theory Pract 2009; 14:7–18. doi:10.1007/s10459-009-9184-0
15. Redelmeier DA, Ferris LE, Tu JV, Hux JE, Schull MJ. Problems for clinical judgement: introducing cognitive psychology as one more basic science. CMAJ. 2001; 164: 358-360.
16. Croskerry P, Nimmo GR. Better clinical decision making and reducing diagnostic error. J R Coll Physicians Edin. 2011; 41:155-162.
17. Royce CR, Hayes MM, Schwartzstein RM. Teaching critical thinking: a case for instruction in cognitive biases to reduce diagnostic errors and improve patient safety. Academic Medicine. 2019; 94: 187-194.

Competing interests: No competing interests

09 March 2022
Pat G Croskerry
Physician
Mike Clancy MB ChB M Sc
Dalhousie University Medical School
Continuing Professional Development Medical Education, Faculty of Medicine, Dalhousie University, PO Box 15000 Halifax, Nova Scotia B3H 4R2 Canada