
Observations: Reality Check

The future of medicine lies in truly shared decision making

BMJ 2013;346:f2789 (Published 14 May 2013)
Ray Moynihan, author, journalist, and senior research fellow, Bond University, Australia
RayMoynihan{at}

Could the public play a greater role in helping reform the healing profession?

One of the welcome shifts under way in medicine is the move towards “shared decision making,” where hubris and hierarchy give way to humility and equality.1 Part of a wider reshaping of the roles and responsibilities of patients and professionals,2 the shift is challenging the long held belief that doctors know best. Rather than experts who persuade, in the new model the professionals support people in making more informed decisions about their health. But what if we take this notion of a meeting of equals seriously? Could people help professionals to make more informed decisions? Not by offering tips to the surgeon from the operating table but rather by citizens playing a more active role in some of the big and pressing debates about the future health of medicine.

As more and more voices inside the medical establishment identify the problem of too much medicine, a broader conversation with a wider public may be indicated. With calls to put the very culture of medicine under increasing scrutiny, perhaps we need more playwrights and poets and philosophers looking down the microscope too—searching for those unhealthy myths that may do more harm than good.

One such unhealthy myth underpins the way we commonly think about defining, diagnosing, and treating disease: the perceived dichotomy between patient and doctor. On the one hand is the rational doctor, working with objective and reliable evidence on disease and how best to treat it. On the other hand there’s the emotional patient, presenting the subjective and unreliable experience of their symptoms and illness. It’s a powerful old dichotomy, still deeply etched in our thinking about science and medicine, yet surely it demands a major reality check.

The definition of “disease” is itself highly controversial, with one camp arguing that diseases are defined by objective malfunctioning and another arguing they’re more the product of culture than of nature. The boundaries defining disease are often arbitrary: a simple line on a graph that could medicalise millions of people overnight may have been drawn by a group of tired individuals at the end of a long day in a poorly lit hotel room. Reliability of the tools to “diagnose” these “diseases” is often poor, resulting in all manner of false results and the potential for unnecessary labelling and harm. As to the data on treatments, the evidence based folks and the research on conflicts of interest have shown many examples of poor evaluation and a systemic bias in industry funded studies, which make it extremely hard to divine the difference between unreliable marketing and reliable medical information. Wider public appreciation of these uncertainties and their implications for human health is long overdue.

Similarly, medicine’s love affair with surrogate outcome measures warrants much more widespread scrutiny. Clearly surrogates can prove extremely valuable, such as in testing new treatments for life threatening illnesses. But reliance on surrogates to demonstrate success—rather than genuine improvements in health—can be disastrous.3 Flecainide reduced irregular heartbeats but killed thousands.4 Long term hormone replacement drugs lowered concentrations of “bad cholesterol” but raised the risks of heart attacks and strokes.5 The fact that surrogates are still so prominent in medical research and regulation is another sign that we need more shared decision making—about the future of medicine.

The tendency to individualise and medicalise problems caused by a suite of social or environmental factors is another issue ripe for more vigorous debate with a much wider range of players. Should the growing “epidemic” of type 2 diabetes be conceived primarily as a medical problem caused by endocrine malfunctioning that requires mass surveillance and mass medication, or as a masking of the results of misguided policies on transportation, advertising standards, and workplace design?

In a sense these are the easy questions; it’s harder to know how medicine might reach outside itself more effectively. The popular work of writers such as Ben Goldacre and others is helping to bring greater public attention to these debates, particularly about relations with the industry.6 For generations the relationship has provided doctors with free food, flattery, and friendship, but now a wider friendship group might bring in fresh ideas—and not just the stale smiles of marketing reps selling spin.

Perhaps medical journals and professional associations could play more of a role in providing space for informed thinkers outside medicine. Perhaps citizens’ groups could augment the population’s health literacy to help build a more coherent and confident public voice to take part in debates about too much medicine. Citizen juries, town hall meetings, and social media networks are other obvious ways to engender discussion.

Just as when we’re sick or uncertain about our health a meeting with a doctor can offer much needed help in making a decision, confronting the unhealthy nature of medical excess and drilling deep into its causes may well be made easier by sharing decisions with a few more of those outside the surgery.

