
Clinical Review

Explaining risks: turning numerical data into meaningful pictures

BMJ 2002; 324 doi: (Published 06 April 2002) Cite this as: BMJ 2002;324:827
  1. Adrian Edwards, senior lecturer (Edwardsag{at},
  2. Glyn Elwyn, senior lecturer a,
  3. Al Mulley, chief of general medical division b
  1. a Department of General Practice, University of Wales College of Medicine, Llanedeyrn Health Centre, Llanedeyrn, Cardiff CF3 7PN
  2. b Massachusetts General Hospital, 50 Staniford Street, 9th Floor, Boston MA 02114, USA
  1. Correspondence to: A Edwards

The way in which information is presented affects both how health professionals introduce it and how patients use it

The “information age” has profound implications for the way we work. A growing volume of information derives from the biomedical and clinical evaluative sciences and is increasingly available to clinicians and patients through the world wide web.1 We need to process this information, derive knowledge from it, and disseminate that knowledge into clinical practice. This is particularly challenging for doctors in the context of the consultation. Information often highlights uncertainties, including collective professional uncertainty, which we address with more and better research; individual professional uncertainty, which we address with professional education and support for decisions; and stochastic uncertainty (the irreducible element of chance), which we address with effective risk communication about the harms and benefits of different options for treatment or care.

In this article we discuss whether the shift towards a greater use of information in consultations is helpful and summarise the current literature on risk communication. We also explore how information can be used without losing the benefits that are traditionally associated with the art, rather than the science, of medicine.

Summary points

Patients often desire more information than is currently provided

Communicating about risks should be a two way process in which professionals and patients exchange information and opinions about those risks

Professionals need to support patients in making choices by turning raw data into information that is more helpful to the discussion than the raw data alone

“Framing” manipulations of information to achieve professionally determined goals, such as presenting relative risks in isolation from the underlying base rates, should be avoided

“Decision aids” can be useful as they often include visual presentations of risk information and relate the information to more familiar risks



This paper draws on systematic reviews and other key literature in the field.2 3 4 5 6 7 We reviewed literature addressing shared decision making for communicating risks, supporting patients' decisions, and the specific issue of risk communication about cancer.4 5 6 7 We also appraised key reviews outside the healthcare setting.8 9 10

Definition of risk communication

Risk communication is defined as the open two way exchange of information and opinion about risk, leading to better understanding and better decisions about clinical management.2 3 4 5 6 7 8 9 10 11 This definition moves away from notions that information is communicated only from clinician to patient, and that acceptability (or not) of the risk is communicated back. The two way exchange about information and opinion is important if decisions about treatment are to reflect the attitudes to risk of the people who will live with the outcomes.

The problems of risk language

Terms such as probable, unlikely, rare, and so on have been shown to convey “elastic” concepts.12 13 One person's understanding of “likely” may be a chance of 1 in 10, whereas another may think that it means a chance of 1 in 2. Any one person may also interpret the term differently in different contexts—a “rare” outcome is a different prospect in the context of genetic or antenatal tests than, for example, in the context of antibiotic treatment for tonsillitis.

Interpretation of numerical information is problematic. For example, Yamagishi found that death rates of 1286 out of 10 000 were rated as more risky than rates of 24.14 out of 100, even though the latter proportion is nearly twice as high.14 In addition, the interpretation of the probabilistic elements of risk cannot be divorced from the importance of the harm, which includes the meaning of the harm and its implications for lifestyle and health (such as the threat of cancer). But people's interpretations of a condition and its burden also vary—for example, a stroke may mean different things to people according to their personal experience. The context of the risk is also important.15 Risks may be voluntary or imposed; they may be familiar or unknown, which may affect the degree of dread; and they may be concentrated or dispersed over time. These characteristics affect the way people interpret information on risk, and the way they discount the risks (or not) over time affects their responses to them.
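This “ratio bias” can be checked with elementary arithmetic: normalising both death rates to the same scale shows that the rate judged riskier is in fact the smaller one. A minimal sketch in Python:

```python
# Yamagishi's two death rates, normalised to proportions for comparison.
rate_large_denominator = 1286 / 10_000   # "1286 deaths out of 10 000"
rate_small_denominator = 24.14 / 100     # "24.14 deaths out of 100"

# The rate quoted per 10 000 was judged riskier, yet it is the smaller one.
print(f"{rate_large_denominator:.2%} vs {rate_small_denominator:.2%}")
# → 12.86% vs 24.14%
```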

Proposals have been made to standardise the language of risk. Calman suggested a scale with standardised terms for specified frequencies (for example, “high” for risks greater than 1 in 100 and “moderate” for risks between 1 in 100 and 1 in 1000).16 Professionals and the public could become familiar with these standards, and this could promote a more accurate perception of risk. Paling made a similar proposal, a derivation of which is shown in fig 1. He developed this theme further, comparing pictorial representations of risk with more familiar everyday risks.17 The pictorial representation of risk could have many advantages, but language evolves and is not static, and patients would probably not interpret such standardised terms consistently.
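Such a standardised scale amounts to a simple lookup from frequency to verbal label. The sketch below covers only the two bands quoted above; Calman's full scale contains further bands for rarer events, and the exact thresholds here are our reading of the text rather than a definitive implementation.

```python
def calman_term(risk: float) -> str:
    """Map a probability to a verbal band from Calman's proposed scale.

    Only the two bands mentioned in the text are implemented; risks
    below 1 in 1000 fall into further bands of the full scale.
    """
    if risk > 1 / 100:
        return "high"
    if risk >= 1 / 1000:
        return "moderate"
    return "below moderate (see the full scale)"

print(calman_term(1 / 50))    # → high
print(calman_term(1 / 500))   # → moderate
```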

Fig 1

Risk language proposal, derived from Paling17

Framing effects

Further problems in communicating risks result from the effects of different information frames. Framing manipulations are defined as the description of logically equivalent choice situations in different ways, such as advising patients about prognosis by using either survival or mortality data.8 Box 1 summarises the effects of various framing manipulations.

We can be persuasive with information. Pharmaceutical companies use persuasive techniques to present the effects of their drugs to professionals. A survey of publicity about mammography aimed at patients also found that, to encourage uptake, only information on relative risk was presented, undoubtedly because this information is “effective.”18 Perhaps this is justifiable in some situations to achieve the greatest public health gain. But presenting information in such a way, without the whole truth, is not consistent with truly informed decision making.6 We note the US National Research Council's recommendation to seek strictly to inform, unless conditions clearly warrant use of influencing techniques. Such conditions occur only rarely in clinical encounters.

Box 1: The effects of framing and other manipulations3

Information on relative risk is more persuasive than absolute risk data

“Loss” framing (for example, the potential losses from not having a mammogram) influences screening uptake more than “gain” framing

Positive framing (for example, chance of survival) is more effective than negative framing (chance of death) in persuading people to take risky options, such as treatments

More information, and information that is more understandable to the patient, is associated with greater wariness about taking treatments or tests


Principles for future communication of risks

Information on risk must be presented in a balanced manner. Information on relative risk should not be presented in isolation. Some writers advocate presenting information just on absolute risk.19 This can take the form of percentage terms or be converted into integers (such as a chance of 1 in 10). Other formats include the number needed to treat (NNT) and the number needed to harm (NNH). Further derivations of this concept include the number needed to screen, the number needed to test, and the number (of tablets) needed to take (NTNT) to prevent an adverse outcome.19 Empirical data on the value of these terms in discussions with patients are sparse.
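The relation between these formats is purely arithmetical: absolute risk reduction, relative risk reduction, and the NNT are all derived from the same two event rates. A minimal sketch (the event rates below are illustrative, not taken from any cited trial):

```python
def risk_formats(control_risk: float, treated_risk: float):
    """Derive the common risk-presentation formats from two event rates."""
    arr = control_risk - treated_risk      # absolute risk reduction
    rrr = arr / control_risk               # relative risk reduction
    nnt = 1 / arr                          # number needed to treat
    return arr, rrr, nnt

# Hypothetical example: event rate falls from 10% to 5% with treatment.
arr, rrr, nnt = risk_formats(0.10, 0.05)
print(f"ARR {arr:.0%}, RRR {rrr:.0%}, NNT {nnt:.0f}")
# → ARR 5%, RRR 50%, NNT 20
```

The same trial can thus be reported as a 50% relative reduction or a reduction of 5 percentage points, which is precisely the asymmetry that the framing literature warns about.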

It seems most justifiable to use both absolute and relative risk formats. A sense of scale is difficult when risks decrease beyond the commonplace, such as 1 in 100, 1 in 1000, or beyond. A comparative frame of reference (as data on relative risk provide) makes it easier to judge levels of risk. People can make good use of information that is presented simply and effectively.20

Risk information relevant to individuals is more valuable than average population data.2 For example, an individual's risk of future coronary heart disease can be calculated by using information on risk factors (such as age, hypertension, cholesterol, smoking status, or diabetes) from readily available charts.21 The Gail formula for breast cancer risk is another example, but other conditions in which such calculations are readily feasible are limited.22

Information must be presented clearly. Sometimes numerical data alone may suffice. The visual presentation of risk information has also been explored. Some empirical studies suggest that many patients prefer simple bar charts to other formats such as thermometer scales, crowd figures (for example, showing how many of 100 people are affected), survival curves, or pie charts; other studies have found that people may prefer presentations that lead them to less accurate perceptions of risk.23 24
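A crowd figure of the kind mentioned above can be generated in a few lines: each symbol stands for one person out of 100. This is a text-only sketch; decision aids in practice use pictograms or stick figures.

```python
def crowd_figure(affected: int, total: int = 100, per_row: int = 10) -> str:
    """Render a text icon array: 'X' for affected people, '.' for the rest."""
    symbols = ["X"] * affected + ["."] * (total - affected)
    rows = ("".join(symbols[i:i + per_row]) for i in range(0, total, per_row))
    return "\n".join(rows)

print(crowd_figure(12))   # 12 of 100 people affected
```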

Care is required to avoid an overload of information. Most patients, when asked, express a strong desire for information.25 But people's ability to assimilate information varies. It may sometimes be possible to provide and discuss information over several consultations.

Putting principles into practice

Putting these principles into practice is within reach, provided that professionals are aware of the pitfalls (for example, framing effects). Decision aids include booklets, tapes, videodiscs, interactive computer programs, or paper based charts, to help presentation and discussion of risk information with patients.7 Websites portray the harms or benefits associated with treatment options. An example is the treatment of otitis media with antibiotics (fig 2).

Fig 2

Portrayal of risks and benefits of treatment with antibiotics for otitis media, designed with Visual Rx, a program that calculates numbers needed to treat from the pooled results of a meta-analysis and produces a graphical display of the result
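The underlying calculation is straightforward: a pooled odds ratio from a meta-analysis is combined with a baseline (control group) event rate to give an NNT. The sketch below shows that conversion in general terms; the figures used are illustrative, not taken from the otitis media review, and the exact implementation in Visual Rx is not shown here.

```python
def nnt_from_odds_ratio(baseline_risk: float, odds_ratio: float) -> float:
    """Convert a pooled odds ratio plus a baseline event rate into an NNT.

    The odds ratio is applied to the baseline odds, the treated odds are
    converted back to a risk, and the NNT is the reciprocal of the
    absolute risk reduction.
    """
    odds_control = baseline_risk / (1 - baseline_risk)
    odds_treated = odds_control * odds_ratio
    risk_treated = odds_treated / (1 + odds_treated)
    arr = baseline_risk - risk_treated     # absolute risk reduction
    return 1 / arr

# Illustrative figures: 20% baseline event rate, pooled odds ratio 0.5.
print(round(nnt_from_odds_ratio(0.20, 0.5), 1))   # → 11.2
```

Because the NNT depends on the baseline risk as well as the pooled odds ratio, the same meta-analysis yields different NNTs for patients at different baseline risk, which is one reason individualised presentations are preferable to population averages.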

Alternatively, paper based graphs can be used to support discussions with patients (fig 3).26 Booklets, tapes, and other materials may often not be necessary but could be available if patients wish. Patients could also be referred to reference sources such as the Database of Individual Patient Experiences, the UK National electronic Library for Health, and Health Crossroads (box 2). Health Crossroads explores multiple decision points at different times in the patient's “career.” Opportunities are provided to explore scenarios—for example, the harms and benefits of a treatment option alongside the converse harms and benefits from not choosing it.

Box 2: Examples of patients' reference points

Database of Individual Patient Experiences—contains patients' experiences, including narratives of decision pathways

UK National electronic Library for Health—includes different levels of information for patients

Health Crossroads—audio overviews for over 70 decisions (or crossroads) in different conditions including breast cancer, benign uterine conditions (such as fibroids), prostate disease, coronary artery disease, and back pain.

Fig 3

Portrayal of the risks and benefits of hormone replacement taken for five years26

Time remains a problem in the short term. External sources may enable much of the information gathering to occur outside consultations. Discussions will still be required, but an investment of time at the decision making stage may result in more succinct discussions in the future—and then save time.

Is this the direction we wish to take?

The developments in how information is presented offer opportunities to put substance into commonplace healthcare discussions. But does this swing the balance away from the art of medicine? Will it become less of a “high touch” discipline, in which professionals try to support patients through episodes of illness, and more of a “high tech” one, in which reductionist approaches see pathways of illness as a series of dilemmas that can be “solved”? There may be intangible, even mysterious, value in the softer art of medicine—is this being endangered?

In answer to this, developments in risk communication are not necessarily opposed to these values (although we should beware). It may be helpful to discuss the frequencies of outcomes but still leave room to explore uncertainties that persist. In some scenarios, tests may genuinely increase uncertainty.27 For example, discussion of the pros and cons of testing for prostate specific antigen may increase uncertainty in patients once they are aware of the test's false positive and false negative rates.

The importance of the risks to patients also varies. As patients live through the risk, or the condition if it occurs, they may value the supportive or caring contribution from the same healthcare professionals who have shared information with them.28 Also, we should not assume that decisions are always made on a rational basis.29 Descriptive decision theory says that people regularly make decisions that are not compatible with maximising expected utility. Most of the data available also entail uncertainties. Honesty about this may enhance the role of and respect for professionals, not diminish them. A healthcare professional in the 21st century may be able to be supportive (at times an “agent” for the patient30) but also be an “enabler” and someone to facilitate informed choices.


Certain recommendations for designing materials informing about risk can already be implemented (box 3). The information should be simple and balanced. Comparison with “everyday risks” with which people are familiar may help to present risk data.17 We need to synthesise the current evidence on patients' preferences for different information formats and assess the effects of various formats. Epidemiological work is required to enable calculation of individualised risk estimates across a wide range of clinical conditions. Then further practical decision aids (including digital and interactive ones) can be developed that professionals could have available in their consultations. Evaluation of how patients use these and how professionals can advise their patients alongside this information will be needed.

Box 3: Design of risk information formats for patients

Graphical displays of information increase the effectiveness of risk communication

Simple bar charts may be preferred over “representations” (faces, stick figures, etc)

Avoid using areas or volumes to depict quantities

Absolute risks (with appropriate scales) should be given greater prominence than relative risks—in both information for patients and journals for professionals

Lifetime risks should be given, with relevant information about risks in relevant time spans as additional information

The influence of framing should be countered by using dual representations (loss and gain, mortality and survival data)

Be clear about whether the task is to read an exact value, compare two risks, assess trends, or judge proportions; a display should provide the relevant but minimum information needed for these tasks

Comparison with everyday risks is valuable, such as where the risk (for example, stroke in atrial fibrillation) is compared with other well known risks (for example, road crashes). These comparisons should be integrated into patient information materials
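The dual representation recommended above is simple to generate once a single agreed figure is available, because the loss and gain frames are complements of each other. A minimal sketch (the 90% survival figure is illustrative):

```python
def dual_frames(survival: float) -> tuple:
    """Return gain- and loss-framed statements of the same outcome."""
    mortality = 1 - survival
    return (f"{survival:.0%} of patients survive",
            f"{mortality:.0%} of patients die")

gain, loss = dual_frames(0.90)
print(gain)   # → 90% of patients survive
print(loss)   # → 10% of patients die
```

Presenting both statements together, rather than whichever one suits a professionally determined goal, is what distinguishes informing from persuading.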


We thank John Paling and Chris Cates for permission to use their material in the figures.


  • Competing interests AM has received consulting fees and salary support from the Foundation for Informed Medical Decision Making, and has a royalty agreement with the foundation.

