Rapid response to:

Education And Debate

Analysis of quality of interventions in systematic reviews

BMJ 2005; 331 doi: https://doi.org/10.1136/bmj.331.7515.507 (Published 01 September 2005) Cite this as: BMJ 2005;331:507

Rapid Response:

Systematic reviews should consider quality of intervention beyond design, to include delivery and evaluation of mechanisms of action

The data presented by Herbert and Bø show that the conclusions of systematic reviews of the literature can differ markedly depending on whether trials of interventions without quality assurance procedures are excluded. As interventions become more complex (e.g. multifaceted behavioural interventions rather than administration of a drug), the task of ensuring quality also becomes more complex. There are two aspects of "quality" in this context. The first is the extent to which the intervention is likely to be effective, given its components; to evaluate this, the specific components of the intervention should be described in sufficient detail (see [1]), and the theoretical and empirical evidence for their likely efficacy referenced. The second is the extent to which the intervention as delivered in practice is faithful to the intervention protocol; this should be monitored and reported [2].

The importance of differentiating between these two key aspects of
quality is illustrated by a recent trial of an intervention delivered by
trained facilitators to increase physical activity in those at risk of
Type 2 diabetes [3]. The intervention was of high quality, as evidenced by
its theory- and evidence-based components. Intervention
development was informed by a series of expert meetings, systematic
reviews, consultation with the target group, and extensive piloting. The
development phase of the intervention adhered to recommended practice for
developing and evaluating complex interventions [4]. The facilitators
received systematic training and supervised practice. However, careful
recording, coding and analysis of the intervention in practice showed that
different components of the intervention were delivered to different
extents, varying across sessions and facilitators [5,6]. Thus, both
intervention efficacy and fidelity should be assessed if the scientific
potential of intervention trials is to be realised. In complex
interventions, the efficacy of individual components working
together is often unknown. The scientific potential of intervention
studies includes answering questions about not only whether, but also why
interventions are effective [1]. The extent to which evaluations of
interventions are designed to answer questions about causal mechanisms can
be seen as a third key aspect of quality.

In summary, quality assessment goes beyond that described by Herbert
and Bø. First, it requires evaluation of the design of the intervention,
which requires precise description of the intervention components. Second,
it requires evaluation of the extent to which the intervention was
delivered in practice as described in the protocol. And third, it requires
measurement along the hypothesised causal pathway of change to learn more
about mechanisms of effects. We have developed a method to assess quality
of intervention delivery and theoretical explanation using this model
[5,6].

Systematic reviews should consider these three aspects of the quality of interventions evaluated in clinical trials: the extent to which authors explicitly report details of the behaviour change techniques, the degree to which the intervention was delivered as planned, and the theoretical constructs targeted by the intervention. Reporting these constructs allows planned mediation analyses to test the hypothesised causal paths of the intervention.

[1] Michie S and Abraham C. Interventions to change health
behaviours: evidence-based or evidence-inspired? Psychology and Health
2004;19:29-49.

[2] Bellg AJ, Borrelli B, Resnick B, Hecht J, Minicucci DS, Ory M,
Ogedegbe G, Orwig D, Ernst D, Czajkowski S for the Treatment Fidelity
Workgroup of the NIH Behavior Change Consortium. Enhancing treatment
fidelity in health behavior change studies: best practices and
recommendations from the Behavior Change Consortium. Health Psychology
2004;23:443-51.

[3] Williams K, Prevost AT, Griffin S, Hardeman W, Hollingworth W,
Spiegelhalter D, Sutton S, Ekelund U, Wareham N and Kinmonth AL. The
ProActive trial protocol - a randomised controlled trial of the efficacy
of a family-based, domiciliary intervention programme to increase physical
activity among individuals at high risk of diabetes [ISRCTN61323766]. BMC
Public Health 2004;4:48.

[4] Campbell M, Fitzpatrick R, Haines A, Kinmonth AL, Sandercock P,
Spiegelhalter D, and Tyrer P. Framework for design and evaluation of
complex interventions to improve health. British Medical Journal
2000;321:694-96.

[5] Hardeman W, Michie S, Prevost T, Fanshawe T, Kinmonth AL. Do
trained intervention facilitators use theory-based behaviour change
techniques? Results from the ProActive Fidelity Project. Psychology and
Health 2005;20 Suppl 1:107.

[6] Michie S, Hardeman W, Abraham C. Identifying effective
techniques: the example of physical activity. Psychology and Health
2005;20 Suppl 1:173-74.

Email: wendy.hardeman@phpc.cam.ac.uk

Competing interests:
None declared

09 September 2005
Wendy Hardeman
Senior Research Associate
Susan Michie, Professor of Clinical Health Psychology, Department of Psychology, UCL; Ann Louise Kinmonth, Professor of General Practice, University of Cambridge
General Practice and Primary Care Research Unit, University of Cambridge, Cambridge CB2 2SR, UK