
Rapid response to the editorial: Qualitative research and The BMJ

BMJ 2016; 352 doi: https://doi.org/10.1136/bmj.i641 (Published 10 February 2016) Cite this as: BMJ 2016;352:i641

Rapid Response: Re: Qualitative research and The BMJ

Loder and colleagues’ reply fails to address one of the main points raised by Greenhalgh et al’s letter: the journal’s editorial policy implicitly constrains the topics on which research papers can be published in The BMJ, in effect excluding research on complex interventions. For such interventions, as Woolcock (1) warns, ‘the default assumption regarding their external validity... should be zero’; to learn anything about complex interventions that can validly be applied elsewhere, case studies using qualitative or mixed methods are likely to be more useful than RCTs.

Many of the quantitative papers that do get published in The BMJ build their claims to generalisability on sand by failing to acknowledge that the apparently simple interventions they ‘test’ do not work in isolation but as components of a complex system, particularly when they are implemented in primary care. If these papers do indeed change clinical practice, as the editors hope, they may well do so by misleading doctors into assuming that what ‘works somewhere’ will also ‘work here’ (2).

1. Woolcock M. Using case studies to explore the external validity of ‘complex’ development interventions. Evaluation. 2013;19:229-48.
2. Cartwright N, Hardie J. Evidence-Based Policy: A Practical Guide to Doing It Better. Oxford, UK: Oxford Scholarship Online; 2012 (published online February 2015).

Competing interests: No competing interests

27 February 2016
Louisa Polak
GP and PhD student
Colchester