The problem with medical advice columns

BMJ 1999; 319 doi: https://doi.org/10.1136/bmj.319.7214.928 (Published 2 October 1999). Cite this as: BMJ 1999;319:928
Studies show that patients get most of their health advice from the media. Doctors, however, are fond of blaming broadcasters and journalists for inaccurate reporting and scaremongering. But is this criticism justified? After all, much of that advice is offered by doctors writing health columns.
The accuracy of information in the lay press was recently examined in Canada. The study looked at a random sample of 50 advice columns on geriatric problems, written by doctors and published in 11 different Canadian daily newspapers in 1995 (see News, BMJ 11 September, p 658). The group of geriatricians who evaluated these columns found that 28% gave potentially life threatening advice, in 22% critical issues were not clearly identified, and in 14% opinion was likely to be interpreted as fact.
Given the number of patients now arriving at general practitioners' surgeries clutching newspaper articles, it is time for a wider review of these advice columns. Various methodologies have been developed to produce measures of quality for consumer health information—such as DISCERN, an instrument for judging the quality of consumer health information on treatment choices—and some of these could help in evaluation of health columns. How do the UK newspapers' advice columns rate? I regularly read them, firstly, to try to be one step ahead of my patients, and, secondly, to see what wrong advice patients may be exposed to. A quick glance at a selection of recent columns reveals that the quality of advice is mixed.
The Sunday People has Dr Vernon's casebook looking into why “I can't find a lover man enough to satisfy me.” He responds: “By and large women are more highly sexed and more imaginative than men. They have much dirtier minds, are far less prudish and are more willing to be adventurous in bed or in the park.” Is he reporting the findings of his own unpublished, double blind, randomised controlled trial? More dangerously, in the same column is the question: “Does someone's sex drive disappear if they don't have sex for a while?” To this he replies: “Yes. Use it or lose it.” I'd like to know on what evidence he bases that statement. Don't use it but lose it might be a better way to approach Dr Vernon Coleman's advice.
From a dubious evidence base to the possibly misleading. Dr Mark Porter of the Sunday Mirror was asked how radiotherapy worked in breast cancer and what the likely side effects were. He replied: “It is an effective form of treatment in breast cancer and tends to be fairly well tolerated by most patients.” Generally true, but it is inadequate and potentially misleading because radiotherapy stops local recurrence but does not stop the spread of the cancer elsewhere, nor does it affect five year mortality.
By contrast, Dr Ann Robinson of the Guardian, in answer to a question about “swollen” breasts, writes: “Gamolenic acid may help and is very unlikely to cause side effects”—a statement that is supported by research evidence even though it is not mentioned. Dr Miriam Stoppard of the Mirror does even better when asked about a TENS machine for back pain. She explains: “Tens is beneficial in about 60% of cases, but pain relief in some people lasts only during stimulation.”
Other columnists, such as Dr Fred Kavalier of the Independent and Dr Phil Hammond of the Express, give commonsense advice that you would expect from any good general practitioner. They may not, however, generate the excitement level that Dr Rosemary Leonard of the Daily Mail does by extolling the virtues of a “New test that could discover an allergy.”
So what would improve these advice columns while maintaining their readability and avoiding the dismissal of everything that isn't scientifically “pure”? Health articles, whether by doctors or journalists, can be readable, fun, and factually correct—they just need some time and research. In general, it would be useful if writers were more specific both about the reliability of the evidence on which they base their assertions and, in broad terms, about the numbers of patients who might improve with an intervention. A cautionary note: the use of numbers can make an assertion seem true when there is no firm evidence. Sarah Brewer shows how in answering a question in the Daily Telegraph on the usefulness of multivitamins. She says that it is a good idea to take a vitamin and mineral supplement and quotes research involving 96 older people who took multivitamins for a year. Apparently, they had better immune responses to influenza vaccine and half as many days ill with infections compared with people not taking multivitamins. Were there sufficient patients in this trial to support these claims? Were the two groups comparable? She does not tell us.
Columnists can do more to give their readership a better understanding of the reliability and applicability of their advice, both in general and in individual terms. Is this asking too much? Many doctors do have difficulty explaining evidence and risk, but that skill should be exactly what sets these columnists apart. As a consequence, patients might be better placed to gauge when they are being misled or misinformed. Selling newspapers, however, doesn't always equate with talking sense.