
Education and debate

Users' guide to detecting misleading claims in clinical research reports

BMJ 2004; 329 doi: https://doi.org/10.1136/bmj.329.7474.1093 (Published 04 November 2004) Cite this as: BMJ 2004;329:1093
  1. Victor M Montori, assistant professor1,
  2. Roman Jaeschke, clinical professor of medicine2,
  3. Holger J Schünemann, associate professor of medicine3,
  4. Mohit Bhandari, assistant professor of medicine3,
  5. Jan L Brozek, lecturer in medicine4,
  6. P J Devereaux, assistant professor of medicine3,
  7. Gordon H Guyatt (guyatt@mcmaster.ca), professor of medicine3
  1. Department of Medicine, Mayo Clinic College of Medicine, Rochester MN, USA
  2. Department of Medicine, McMaster University, Hamilton ON, Canada
  3. Department of Clinical Epidemiology and Biostatistics, McMaster University
  4. Polish Institute for Evidence Based Medicine, Jagiellonian University Medical School, Krakow, Poland
  Correspondence to: G H Guyatt
  • Accepted 15 September 2004

Plenty of advice is available to help readers identify studies with weak methods, but would you be able to identify misleading claims in a report of a well conducted study?

Introduction

Science is often not objective.1 Emotional investment in particular ideas and personal interest in academic success may lead investigators to overemphasise the importance of their findings and the quality of their work. Even more serious conflicts arise when for-profit organisations, including pharmaceutical companies, provide funds for research and consulting, conduct data management and analyses, and write reports on behalf of the investigators.

Although guides to help readers recognise methodological weaknesses that may introduce bias are now widely available,2 3 such guides do not protect against misleading interpretations of methodologically sound studies. In this article, we present a guide that provides clinicians with tools to defend against biased inferences from research studies (box).

Guide to avoid being misled by biased presentation and interpretation of data

  1. Read only the Methods and Results sections; bypass the Discussion section

  2. Read the abstract reported in evidence based secondary publications

  3. Beware faulty comparators

  4. Beware composite endpoints

  5. Beware small treatment effects

  6. Beware subgroup analyses

Read methods and results only

The discussion section of research reports often offers inferences that differ from those a dispassionate reader would draw from the methods and results.4 The table gives details of two systematic reviews summarising a similar set of randomised trials assessing the effect of albumin for fluid resuscitation. The trials included in both reviews were small and methodologically weak, and their results were heterogeneous. Both reviews provide point estimates suggesting that albumin may increase mortality, with confidence intervals that include the possibility of a considerable increase in mortality. Nevertheless, one set of authors took a strong position that albumin is dangerous, the other that it is not. Each position was consistent with the interests of the funders of the respective review.5

Table: Comparison of two …
