Five pitfalls in decisions about diagnosis and prescribing
BMJ 2005; 330 doi: https://doi.org/10.1136/bmj.330.7494.781 (Published 31 March 2005). Cite this as: BMJ 2005;330:781
Rapid responses
Two points:
Firstly, 'groupthink' (as I understand it) was a term originally
coined to describe a different faulty decision-making process, in which a
group led by a charismatic individual, and with little external input, is
prone to errors in logic and decision making (e.g. JFK and the Bay of Pigs
fiasco, or a consultant surgeon and a group of awed juniors).
Secondly, a previous contributor has emphasised the work of
Gigerenzer, and I simply wish to add my recommendation to theirs: read
Reckoning with Risk - it is an eye-opener.
Competing interests: None declared
An interesting paper, and I would not question its conclusions.
But what it does not explore is the "corrective power" of groups of
doctors working in series or in tandem on the same patient, tending to
cancel out each other's biases.
I am sure that this groupthink plays just as important a role in
medical decision making as the biased individual decision maker.
Competing interests: None declared
Dr. Klein has made a well-intentioned effort to advise medical
decision makers about potential heuristic-based biases they might
experience. Many of her cautions are well-taken. Yet the treatment given
to the five well-known potential pitfalls does not do justice to the
available research on these topics.
Dr. Klein's review is loosely based on Kahneman and Tversky's early
work on judgment under uncertainty (1). This is only one of several
foundations (albeit a key one with a Nobel prize) of the massive body of
judgment and decision making research from the last 30 years. Many
important extensions, clarifications, and parallel developments (as well
as challenges) to Tversky and Kahneman's work now exist.
Two primary examples are Evans and colleagues' work on bias in human
reasoning (2), and Gigerenzer and colleagues' programme of research on
heuristics and interpretation of risk (3). One need go no further than
the pages of BMJ to find a practical overview by Gigerenzer and Edwards
(4) on the problem of errors commonly made by doctors and patients in
interpreting conditional probabilities. This overview includes a tested
strategy for ameliorating the base-rate fallacy that Dr. Klein described
under Pitfall 1. According to the authors, "the inability to understand
statistical information is not a mental deficiency of doctors or patients
but is largely due to the poor presentation of the information." (p. 744).
The solution? Present information, such as the probability of a disease
given a positive diagnostic test, as natural frequencies rather than the
usual conditional probabilities. While the authors acknowledge that this
is not a panacea, they have reported remarkable effectiveness in
substantially reducing interpretation errors across a variety of
diagnostic situations (3).
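Gigerenzer and Edwards' natural-frequency prescription is easy to see in a small worked calculation. The sketch below (Python, using illustrative prevalence and test-accuracy figures that I have assumed for the sake of the example, not numbers quoted from either article) computes the probability of disease given a positive test twice: once via Bayes' theorem on conditional probabilities, and once by restating the same information as counts out of 1,000 people.

```python
# Illustrative screening figures (assumed for this sketch, not taken
# from Klein or from Gigerenzer and Edwards).
prevalence = 0.008       # P(disease)
sensitivity = 0.90       # P(positive test | disease)
false_positive = 0.07    # P(positive test | no disease)

# Conditional-probability route (Bayes' theorem): the format that
# doctors and patients often misinterpret.
p_positive = prevalence * sensitivity + (1 - prevalence) * false_positive
p_disease_given_positive = prevalence * sensitivity / p_positive

# Natural-frequency route: the same numbers restated as counts,
# from which the answer can be read off directly.
n = 1000
with_disease = prevalence * n                          # 8 of 1,000 people
true_positives = sensitivity * with_disease            # about 7 test positive
false_positives = false_positive * (n - with_disease)  # about 69 test positive
nf_estimate = true_positives / (true_positives + false_positives)

print(f"Bayes: {p_disease_given_positive:.3f}")
print(f"Natural frequencies: {nf_estimate:.3f}")
```

Both routes give the same answer (roughly 9% with these assumed figures); the point is that "of 1,000 people, 8 have the disease and about 7 of them test positive, while about 69 of the healthy 992 also test positive" makes the small posterior probability far harder to misread than the bare sensitivity figure.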
Another BMJ article has previously reviewed the cognitive processes
involved in diagnostic reasoning (5). This review, in my opinion, does
better justice than the current article to some of the same biases to be
avoided.
References
1. Kahneman D, Slovic P, Tversky A, eds. Judgment under uncertainty:
heuristics and biases. Cambridge: Cambridge University Press; 1982.
2. Evans J. Bias in human reasoning: causes and consequences. London:
Lawrence Erlbaum Associates; 1989.
3. Gigerenzer G. Calculated risks: how to know when numbers deceive
you. New York: Simon and Schuster; 2002.
4. Gigerenzer G, Edwards A. Simple tools for understanding risks:
from innumeracy to insight. BMJ 2003;327: 741-4.
5. Elstein AS, Schwarz A. Clinical problem solving and diagnostic
decision making: selective review of the cognitive literature. BMJ
2002;324: 729-32.
Competing interests: None declared
Jill,
Congratulations on addressing a problem which pervades all complex
decision processes. I hope this paper will be read by attorneys general
everywhere when they consider tort law reform, to avoid the ongoing
destruction of health care by avarice.
Decision errors are inevitable and systematic, and are detected only by
rigorous personal review, by peer review, or by the occurrence of a major
adverse event. Recurrence can be reduced through education and training,
in both facts and process, neither of which occurs in the current
punitive, adversarial system.
Bravo.
Competing interests: None declared
I really enjoyed reading this paper and hope that awareness of the
issues will guide my decision making. However, this paper does little to
put the heuristics of decision making in the context of the imperfect
evidence base that doctors have to use to muddle through.
For example, the author states that "Homoeopathy provides an excellent
example of illusory correlation," based on the statement that "no
convincing evidence exists that homoeopathic treatments are effective."
Readers accepting this statement would surely be falling into pitfall 3:
overconfidence in the reliability of the evidence base. Firstly, I suspect
that homeopathically trained doctors would challenge the statement;
secondly, all others should note the distinction between lack of evidence
of effectiveness and evidence of lack of effectiveness. Our evidence base
is far from perfect, and failure to recognise this is a barrier to good
decision making.
In the conclusion the author refers to "distortions and biases in the
way information is gathered and assimilated". It is lovely that the author
provides us with a working example, as this paper itself is distorted and
biased for the following reason: it fails to address the extent to which
errors in decision making are due to biases of interpretation by the
individual, and the extent to which they are due to biases in the
information available to us.
Perhaps a useful heuristic principle for doctors to use is never to
trust a professor of marketing, writing a company-funded paper, to give
you an unbiased perspective.
Competing interests: None declared
This article could be summarised as saying that we are:
tempted towards a judgemental conclusion, because as we know we are all
tempted to judge;
tempted towards what we have experienced, because that rewards our
experience and makes us feel competent;
prone to not keeping an open mind at every step, because patients are not
comfortable with doctors who keep too open a mind, as they look less
decisive;
tempted towards convergence, because it reduces the new avenues to
explore; and
tempted towards early linkage of facts, because it makes us feel we are
rewarding our intelligence.
The problem is that in primary care, patients, colleagues and
politicians want quick decisions, and many of the "better" processes
militate against speed. Society and people want it all. They want
efficient availability, but they want considered decisions which can
stand the cold light of endless re-examination when something goes wrong.
Will the government pay for a service that can allow physicians to
avoid the errors mentioned in this article? If not will they take on board
that we need to commit these "errors" to get through the day?
Competing interests: None declared
Mr Sarkies: the quote is accurate, from "As You Like It", Act 5 scene
1.
Competing interests: None declared
I have searched for the Shakespeare quotation 'The fool doth think he
is wise, but the wise man knows himself to be a fool' in my Oxford
Quotations without success. The best match was:
'Yet with great toil all that I can attain
By long experience, and in learned schools,
Is for to know my knowledge is but vain,
And those that think them wise, are greatest fools.'
Sir William Alexander, Earl of Stirling (1567-1640),
The Tragedy of Croesus
Competing interests: None declared
It is amazing that an author who writes about bias is not aware of
her own possible bias. Jill Klein presents homeopathy as an example of
illusory correlation, but you have to read the literature with
considerable confirmation bias to use it as proof for this purpose.
Vandenbroucke challenged the medical community to produce a conventional
method with better proof than homeopathy.(1) Fundamental research is
improving and yielding remarkable evidence that homeopathy is more than
just a placebo effect.(2) I suspect that Klein is convinced that
homeopathy cannot work, as I was when I started in general practice. But
many patients experienced benefit from homeopathy where I could not help
them.
Then I tried the method myself and observed reactions that I did not
expect. In general, doctors who try homeopathy are amazed that they can
help patients they could not help before.
Klein reverses things: patients show doctors that homeopathy might work;
doctors, on the other hand, are reluctant to accept this conclusion. They
cling to the hypothesis that medicines can only work by molecular
interactions, and refuse to consider other options. Brownian motion, the
fact that particles hover in a medium defying gravity, can illustrate
that this is a mistake. It took many theories and many more experiments
before Einstein and Smoluchowski solved this problem theoretically and
Perrin solved it experimentally.(3) The starting point for this research
was believing our own eyes. Why can't we believe our patients?
Lex Rutten, homeopathic physician
References
1. Vandenbroucke JP. Medical journals and the shaping of medical
knowledge. Lancet 1998;352:2001-6.
2. Belon P, Cumps J, Ennis M, Mannaioni PF, Roberfroid M, Sainte-Laudy
J, Wiegant FAC. Histamine dilutions modulate basophil activation.
Inflamm Res 2004;53:181-8.
3. Mayo DG. Error and the growth of experimental knowledge. Chicago:
The University of Chicago Press; 1996.
Competing interests: Homeopathic physician
Beyond pitfalls in medical decision-making: adaptive heuristics and customized solutions
It is most welcome to see an article addressing decision making in
clinical practice in a major journal. The emergence of a clear picture of
the scale of preventable harm to patients in hospital mandates a critical
look at care provision(1). This applies on all levels from the
organisational/systems level down to optimising safe practice by
individual clinicians. The extent of the contribution of diagnostic error
and decision-making to the patient safety problem is as yet unknown, but a
significant role is likely. After all, the integration of complex
information - under the inherent conditions of uncertainty in medicine -
can be a demanding and difficult task. We feel there are a number of
issues arising from Dr. Klein's article that warrant discussion(2).
Firstly, the paper would have benefited from a discussion (even a
brief one) of some mechanisms that have been shown to be at work in expert
decision-making, such as pattern recognition. Strategies such as
Recognition-Primed Decision-making(3) or the use of clinical case-based
"scripts"(4) are instantly recognisable to practising clinicians, and
describing them would have made the article more relevant.
Secondly, the article focused heavily on the negative aspects of
heuristics (shortcut reasoning strategies) and biases (predictable
deviations from optimal decision-making that appear to be “hard-wired”
into human cognition). The “heuristics and biases” tradition in human
decision-making has certainly received strong backing by empirical
data(5,6). However, there have also been attempts to view heuristic
reasoning in a more positive light. For instance, despite the fact we can
be led astray by heuristic shortcuts, there are a number of decision-
making instances in which these same heuristics provide a fast and
efficient means to achieving good decision outcomes(7). Not mentioning the
benefits of heuristic-based reasoning gave the article a rather
critical tone that may not help to engage clinicians in the debate.
Finally, the article implied that biases in medical reasoning can be
solved by training: if clinicians know that the biases exist, they will be
less susceptible to their influence. Empirical evidence, however,
questions this conclusion. Attempts at de-biasing by training have at best
been only partially successful. Moreover, there is evidence that some de-
biasing effects decay over time. De-biasing may even have the undesirable
effect of actually impeding optimal use of the relevant information (for
reviews of de-biasing, see Fischhoff(8), Wilson & Brekke(9)). We take
the view that, given the range of contexts and types of clinical
decisions, a creative approach to matching solutions to settings must be
employed. The solutions that such an approach would yield may vary widely,
from training to computerised decision support.
References
1. Vincent C, Neale G, Woloshynowych M. Adverse events in British
hospitals: preliminary retrospective record review. BMJ 2001;322:517-9.
2. Klein JG. Five pitfalls in decisions about diagnosis and prescribing.
BMJ 2005;330:781-3.
3. Klein GA. A recognition-primed decision (RPD) model of rapid decision
making. In: Klein GA, Orasanu J, Calderwood R, Zsambok CE, eds. Decision
making in action: models and methods. Norwood, NJ: Ablex; 1993. p. 138-47.
4. Abernathy CM, Hamm RM. Surgical intuition. Philadelphia: Hanley &
Belfus; 1995. p. 93-124.
5. Gilovich T, Griffin DW, Kahneman D, eds. Heuristics and biases: the
psychology of intuitive judgment. Cambridge: Cambridge University Press;
2002.
6. Kahneman D, Slovic P, Tversky A, eds. Judgment under uncertainty:
heuristics and biases. Cambridge: Cambridge University Press; 1982.
7. Gigerenzer G, Todd PM, ABC Research Group. Simple heuristics that make
us smart. New York: Oxford University Press; 1999.
8. Fischhoff B. Debiasing. In: Kahneman D, Slovic P, Tversky A, eds.
Judgment under uncertainty: heuristics and biases. Cambridge: Cambridge
University Press; 1982. p. 422-44.
9. Wilson TD, Brekke N. Mental contamination and mental correction:
unwanted influences on judgments and evaluations. Psychol Bull
1994;116:117-42.
Competing interests: None declared