Development and evaluation of complex interventions in health services research: case study of the Southampton heart integrated care project (SHIP)
BMJ 1999;318:711 (Published 13 March 1999). doi: http://dx.doi.org/10.1136/bmj.318.7185.711
- Fiona Bradley, lecturera,
- Rose Wiles, senior research fellowb,
- Ann-Louise Kinmonth, professorc,
- David Mant, professord,
- Madeleine Gantley, senior lecturer for the SHIP Collaborative Groupe
- a Department of Community Health and General Practice, Trinity College, Dublin 2, Ireland
- b Health Research Unit, School of Occupational Therapy and Physiotherapy, University of Southampton, Southampton SO17 1BJ
- c General Practice and Primary Care Research Unit, Institute of Public Health, Cambridge CB2 2SR
- d Department of Primary Health Care, Oxford University, Institute of Health Sciences, Headington, Oxford OX3 7LF
- e Department of General Practice and Primary Care, St Bartholomew's and Royal London School of Medicine and Dentistry, Queen Mary and Westfield College, London E1 4NS
- Correspondence to: Professor Kinmonth
- Accepted 10 February 1999
The development and evaluation of complex interventions within randomised controlled designs is a challenging area in health services research. The process usually entails a pilot phase to confirm the feasibility and potential effectiveness of the design before embarking on large and costly trials. However, the focus is often more on the study design and measures than on the theoretical base and extent to which the intervention can be appropriately applied. In this article, we use a case study to describe an approach to pilot work that addresses this gap.
Summary points
- Interventions are often defined pragmatically and lack any clear theoretical basis, which limits generalisability
- Implementation is rarely described, which limits understanding of why an intervention is or is not locally successful
- Integration of qualitative methods within pilot trials can help interpret the quantitative results by clarifying process and testing theory
- This approach defines three levels of understanding: the evidence and theory which inform the intervention, the tasks and processes involved in applying the theoretical principles, and the people with whom, and context within which, the intervention is operationalised
- A case study shows how this novel method of programme development and evaluation can be applied
Compared with drug trials or trials of surgical procedures, the design and development of a health service intervention is highly complex. In practice such interventions are often defined pragmatically, according to local circumstance, rather than building on any specific theoretical approach.1 Even if an approach or technology can be clearly grounded in theory and evidence, it must still be operationalised and evaluated among specific practitioners and patients. There is thus a tension between evaluation of complex interventions and generalisability of results. Randomised trials alone cannot tell us why an intervention was or was not successful, or whether the theory and evidence informing the intervention were appropriate or needed revision.
To clarify this, we propose three levels for defining a complex intervention: the evidence and theory which inform the intervention, the tasks and processes involved in applying the theoretical principles, and the people with whom, and context within which, the intervention is operationalised.
The Southampton heart integrated care programme brought together the principal researchers from the family heart and OXCHECK studies—two trials of primary care interventions for risk reduction in cardiovascular disease.2 3 Debate after publication of the trials highlighted the importance of focusing on the design and process of interventions as much as on the evaluation of outcomes, to understand the process of application of an intervention in a way that would allow success to be understood and replicated, and unsuccessful approaches to be abandoned.4
The comparatively disappointing results from primary preventive programmes for cardiovascular disease also shifted the focus of research towards secondary prevention. The Southampton heart integrated care programme was one such programme. It was led by specialist nurses who coordinated and supported follow up care in general practice of patients with a hospital diagnosis of myocardial infarction or angina.
Definition of levels
The table describes the three levels of intervention as applied to the Southampton heart integrated care programme. Level 1 summarises the theory and evidence underpinning, in this case, the choice of a target population, service provision, and management of behaviour change within the programme; it deals with the gap between evidence of efficacy and provision of treatments for people with established ischaemic heart disease.4-7 It also deals with best practice in enabling behaviour change among practitioners and patients, using guidelines and psychological models.19-22 Level 2 defines the essential tasks and processes required for operationalisation in these areas, at a generalisable level. Level 3 defines who would do what locally—elements which are specific to a local setting.
The pilot trial was designed to assess the impact of the programme on lifestyle and cardiovascular risk. The two arms of the trial compared the new approach with the usual care of patients with myocardial infarction or angina.8 Overall, 597 adult patients (from all 67 general practices in Southampton and south west Hampshire) with myocardial infarction or angina were randomised to intervention (33 practices) or control (34) groups. Follow up was 90% complete. The intervention increased follow up in general practice at 4 months and 1 year, and improved attendance for rehabilitation. No important difference was, however, observed between the intervention and control groups in any of the primary outcome measures of cardiovascular risk.8
Integration of a qualitative approach
The trial quantified the effect on contacts in primary care, but not the quality of those contacts. It did not allow interpretation of negative findings at the three levels of individual people, processes, or theory. To understand how the intervention was delivered, whether some of the elements seen as generalisable by the research team were particularly important or problematic, and whether the underlying theory was appropriate, a qualitative approach was necessary.
The methods and results from the qualitative part of the programme have been reported in detail.9 10 Twenty-five patients from the intervention group and, where appropriate, their partners were interviewed in depth. Participants were selected on the basis of maximum variety sampling.11 Patients were interviewed twice: shortly after discharge from hospital and again around 3 months later. In addition, 22 practitioners (hospital and practice nurses) involved with the intervention were interviewed or participated in a focus group once. A semistructured schedule of topics was used to guide the interviews and focus groups. Interviews and focus groups were audiotaped, fully transcribed, and subjected to thematic analysis to identify emergent themes. Analysis was based on the grounded theory approach, and preliminary analysis of early interviews was used to inform further data collection and analysis.12
Explanation of numerical results
Level 3: people and context
Early data from interviews were used to refine the programme. To optimise implementation of the approach, the findings were fed back to the development group 4 months into the 18 month intervention. Initial analyses concentrated on the experiences and understanding that patients, their partners, and providers had of the role of the liaison nurse, primary care team, and rehabilitation services. The analyses examined the way these individuals perceived the intervention as being implemented. This phase of the analysis showed that patients were confused about the nature of the rehabilitation programme, that progression through the system was often slower than practitioners led them to expect, and that conflicting messages about the need for follow up were sometimes provided by primary and secondary care.
These findings were used to optimise the intervention. One improvement involved each specialist liaison nurse recruiting patients from specific practices for which they were solely responsible, rather than from all practices as before. This change encouraged communication between liaison nurses and practice nurses and between practice nurses and patients.
Level 2: tasks and processes
The later in-depth phase of the qualitative analysis was concerned with the interviewees' experiences, perceptions, and understandings over time after myocardial infarction or angina was diagnosed. The purpose of this analysis was to understand in more detail the tasks and processes needed to operationalise the intervention successfully. This analysis identified issues that may clarify the negative findings of the programme. For example, the focus groups for practice nurses suggested that, for them to follow up patients with established heart disease effectively, greater sophistication in the specification of essential tasks and processes was required than we had anticipated (box).
Suggestions from focus groups with practice nurses about changes to their role in follow up of patients with ischaemic heart disease
- Status within the primary healthcare team must be developed
- Training must address knowledge and skills of cardiac assessment, and drug use and adherence, as well as facilitating behaviour change in relation to lifestyle
- Opportunity must be given for nurses to provide continuity of care
- Improved integration at the primary-secondary care interface is needed, with secondary care staff clearly recognising the role of the practice nurse
Level 1: testing theory
The second purpose of the in-depth analysis was to confirm or question the evidence and theory on which the intervention was based. At this level, the qualitative inquiry also identified potentially important insights. For example, the findings suggest that understanding of heart attack as an acute but short term event will inhibit the adoption of subsequent long term lifestyle change, and that the intervention failed to address this possibility. There were several reasons. Initially patients described their astonishment at surviving a heart attack, which they had previously understood to be a fatal event, and therefore defined their own event as necessarily mild. In the period immediately after the heart attack, the information provided to patients by practitioners apparently encouraged this view of heart attack as a self limited episode from which complete recovery was probable, with little reference to the continuing underlying disease processes. At this stage lifestyle change seemed to be understood by patients as being linked to recovery in the short term rather than a long term preventive measure.
At a later stage patients' understandings about heart attack were subject to change, particularly in cases where the experience of recovery did not reflect the information given. For example, information encouraged patients to believe that they would be able to have sex in 2–3 weeks, would be back to work in 6 weeks, and would be back to normal within 3 months, when this was often not the case.13 At this later stage, faith could be lost in “official” information from practitioners and evidence drawn instead from personal experience. Conflict between the two was associated with questioning the explanatory power of information from practitioners, and with viewing the adoption of long term lifestyle change as action that would not guarantee protection from a further heart attack.
The programme thus found that patients' understandings of heart attack are closely linked to their attitudes to the potential of lifestyle change to keep them well. The failure of the intervention to acknowledge that the occurrence of heart attack, the severity of heart attack, and the natural history of recovery from heart attack cannot be accounted for entirely by lifestyle seemed to be a central feature in patients' understanding.9
Discussion
It is increasingly recognised that qualitative research methods can “reach the parts other methods cannot reach.”14 Qualitative methods can be used on their own as a preliminary to a quantitative study in order, for example, to establish meaningful wording for a questionnaire or to develop the elements of an intervention.15 16 They can be used to explain findings after quantitative research has been completed.17 Quantitative and qualitative methods are also used in parallel with substantive trials.18 23 24 In the pilot trial of the Southampton heart integrated care programme, we have moved a step further by explicitly integrating qualitative methods within a pilot trial design. We argue here that parallel application of qualitative methods in a pilot trial can contribute significantly and efficiently to both optimising and evaluating a new health services intervention.
The pilot randomised controlled trial tests a hypothesis formulated from existing theory and evidence before data collection, examines inputs (resource use) and outcome (effect size), and provides evidence of whether the approach is feasible within a specific locality and worth substantive evaluation. Complementary to this, the qualitative research is concerned with the perspectives of the people involved in delivering and receiving the intervention, and the context in which the data are produced. Analysis of these phenomena can provide information about the process of implementing an intervention and can lead to suggestions about logistical changes needed to optimise the completion of tasks and processes within a local context.
There is always a learning curve in applying new interventions, but it is usually hidden. Our approach makes it explicit and formalises it through systematic feedback, allowing both description and optimisation of the application of an intervention at local level. This begins to move intervention development in the direction of what industrialists call “evolutionary operations.”25 This process involves performing rolling analyses over time—integrating both quantitative and qualitative findings to systematically optimise a production process.
At the deeper level of analysis for emergent themes, qualitative research can help understanding of the process whereby particular outcomes come about. It can thus enable a more sophisticated definition of what needs doing, as was shown with the practice nurses.9 It can also examine and test the theoretical basis of an intervention and question or affirm the principles on which the tasks and processes have been based. In this case the qualitative analysis raised questions about the information on natural history that is currently presented to patients by practitioners. It may be that in an effort to minimise fears and anxieties, practitioners are inadvertently providing an over-optimistic view of the natural history of convalescence for some patients and minimising the chronic nature of the underlying disease. These theoretical ideas are open to future hypothesis testing.
In summary, an integrated quantitative and qualitative approach to developing and evaluating complex interventions in health service research is both efficient and generalisable. Within one pilot study it formalises the usually hidden learning curve of implementation and optimisation. It allows better judgment of transferability of potentially effective programmes to other settings for confirmatory trials. It helps interpret quantitative findings, and questions underlying theory and assumptions to better inform future hypotheses and intervention designs.
We thank the patients and staff of the participating general practices. The Southampton heart integrated care programme was coordinated by the Primary Medical Care Group, University of Southampton in collaboration with: the Institute for Health Research and Development and the Institute for Health Policy Studies, University of Southampton; General Practice and Primary Care Research Unit, University of Cambridge; Department of Medical Statistics and Evaluation, Imperial College School of Medicine, London; Medical Statistics Unit, London School of Hygiene and Tropical Medicine; Health Economics Research Group, Brunel University; Department of Psychology, University of St Andrews; Department of Cardiac Medicine, National Heart and Lung Institute; and Department of Community Health and General Practice, Trinity College, Dublin. Members of the Southampton Heart Integrated Care Programme Collaborative Group are DA Wood (steering group chair), FB (former medical coordinator), M Buxton, A Davies, K Done, K Enright, M Johnston, D Johnston, K Jolly (medical coordinator), A-LK, DM (principal investigator), S Sharp, H Smith, V Speller, S Thompson, D Waller, RW, and L Wright. Members of the parallel qualitative research group are A-LK (chair), M Blaxter, MG, J Robison, A Spackman, and RW.
Contributors FB led the writing group and was lead trial executive while she worked in Southampton. RW carried out and wrote up the qualitative research on which the paper depends. A-LK chaired the Qualitative Research Group and was responsible for articulating the three levels of definition of the programme; she will act as guarantor for the paper. DM chaired the trial group and MG contributed to the design and process of the qualitative research and to the idea of writing up the development and evaluation of the complex intervention. All the authors have contributed to the drafting and critical review of the paper on behalf of the Southampton Heart Integrated Care Programme Collaborative Group.
Funding Research and development national programme grant from the National Health Service Executive, with service support from Southampton and South West Hampshire Health Authority. RW was in receipt of an NHS South and West Region research and development research training fellowship and FB was in receipt of a European Union research training fellowship. Subsequent development of the ideas in this paper was not externally funded.
Competing interests None declared.