
Editorials

Missing clinical trial data: setting the record straight

BMJ 2010; 341 doi: https://doi.org/10.1136/bmj.c5641 (Published 12 October 2010) Cite this as: BMJ 2010;341:c5641

This article has a correction.

Fiona Godlee, editor, Elizabeth Loder, associate editor
BMJ, London WC1H 9JR
fgodlee@bmj.com

Urgent action is needed to restore the integrity of the medical evidence base

Like us, you have probably grown accustomed to the steady stream of revelations about incomplete or suppressed information from clinical trials of drugs and medical devices.1 If so, this issue of the BMJ features a pair of papers that will dismay but not surprise you. Researchers for an official German drug assessment body charged with synthesising evidence on the antidepressant reboxetine encountered serious obstacles when they tried to get unpublished clinical trial information from the drug company that held the data, an experience from which they draw several lessons (doi:10.1136/bmj.c4942).2

Once they were able to integrate the astounding 74% of patient data that had previously been unpublished, their conclusion was damning: reboxetine is “overall an ineffective and potentially harmful antidepressant” (doi:10.1136/bmj.c4737).3 This conclusion starkly contradicts the findings of other recent systematic reviews and meta-analyses published by reputable journals.4 5 6 7 8 These studies presumably met prevailing standards for the conduct of meta-analyses. Yet we now know that they did not provide a properly balanced view of the harms and benefits of reboxetine. Why? Because they did not combine all of the existing evidence from clinical trials. Furthermore, the difficulties encountered by Wieseler and colleagues in obtaining the reboxetine data show that routine inquiries about missing information, which many authors of meta-analyses make, are probably insufficient.9 Instead, dogged, even heroic, persistence is required, as the Cochrane reviewers trying to untangle the evidence for oseltamivir have found.10 11

Research that is conducted but not reported is only part of the problem. Steinbrook and Kassirer point to the rosiglitazone (Avandia) story as an example of problems arising from incomplete access of researchers and others to the raw data within a trial.12 Problems also arise, they say, with the way in which these data are interpreted or adjudicated. They call for journals and editors to do more, including reserving the right to inspect trial data themselves. This is a contentious topic. Commentator Chris Del Mar applauds this stand,13 but Nick Freemantle points out that although it is easy to call for unfettered access to data, it is another thing entirely to provide and make use of it.14

The reboxetine story and similar episodes must call into question the entire evidence synthesis enterprise. Meta-analyses are generally considered the best form of evidence, but is that a plausible world view any longer when so many of them are likely to be missing relevant information?15 Existing estimates of treatment benefits are not always altered when previously unpublished clinical trial data become available. At present, however, we do not know the extent to which integration of missing data would support or refute key portions of the existing evidence on which doctors, patients, and policy makers rely.

As Wieseler and colleagues point out, the Food and Drug Administration Amendments Act of 2007 and parallel European efforts will increase the accessibility of clinical trial results and make it more difficult to conceal information.2 But they do not solve the problem of our current evidence base, which contains incomplete and questionable evidence. So what can be done? At the moment there are no organised efforts to identify missing information and integrate it into the existing evidence base.

The BMJ has a particular interest in the impact of unpublished data on the overall verdict regarding the effectiveness of medical treatment. Because we think that it is important to re-evaluate the integrity of the existing base of research evidence, the BMJ will devote a special theme issue to this topic in late 2011. A detailed call for papers will follow, but we mention this now because we hope that researchers with such projects under way will feel encouraged. We also hope that other potential authors might begin now to plan suitable projects.

We are especially interested in high quality original research that aims to uncover previously unavailable data and re-evaluate treatments and practice in light of that new evidence. The ideal way to summarise the findings would be a formal meta-analysis, showing how the newly identified information affects the balance of benefit to harm. It is not necessary to conclude that full consideration of all of the evidence in fact changes practice—we will also be interested in papers that conclude that, even with new evidence, nothing should change.

Lost in the sometimes rancorous debate over research transparency, and the reasons for publication and non-publication, is the most important thing: efforts are needed to restore trust in existing evidence. To that end, the BMJ is more interested in constructive use of data than finger pointing or blame. We encourage drug companies and device manufacturers, as well as academic researchers, to take advantage of the opportunity afforded by our upcoming theme issue. Full information about previously conducted clinical trials involving drugs, devices, and other treatments is vital to clinical decision making.

It is time to demonstrate a shared commitment to set the record straight.
