
Analysis: Quality Improvement

Different approaches to making and testing change in healthcare

BMJ 2021; 374 doi: https://doi.org/10.1136/bmj.n1010 (Published 17 August 2021) Cite this as: BMJ 2021;374:n1010

Authors:

  1. Greg Ogrinc, senior vice president1 2
  2. Mary Dolansky, associate professor3 4 5
  3. Amy J Berman, senior programme officer6
  4. David A Chambers, deputy director for implementation science7
  5. Louise Davies, associate professor8 9

Affiliations:

  1. Certification Standards and Programs, American Board of Medical Specialties, Chicago, IL, USA
  2. University of Illinois Chicago College of Medicine, Chicago, IL, USA
  3. Frances Payne Bolton School of Nursing, Cleveland, OH, USA
  4. QSEN Institute, Case Western Reserve University, Cleveland, OH, USA
  5. Veterans Affairs Quality Scholars Program, Cleveland, OH, USA
  6. John A Hartford Foundation, New York, NY, USA
  7. Division of Cancer Control and Population Sciences, National Cancer Institute, NIH, Rockville, MD, USA
  8. VA Outcomes Group, Department of Veterans Affairs, White River Junction, VT, USA
  9. Dartmouth Institute for Health Policy and Clinical Practice, Hanover, NH, USA

Correspondence to: G Ogrinc gogrinc@abms.org

Greg Ogrinc and colleagues call for greater exploitation of the synergies between quality improvement and implementation science in improving care

Improving the quality of healthcare is complex. It requires input not just from healthcare providers but also from patients and families to identify gaps, develop meaningful interventions, and ensure that interventions improve care and outcomes and deliver value from their perspective.1 Closing gaps in healthcare quality, improving workflows, and implementing evidence based interventions all require change, but not all changes are successful, and most come with unintended consequences.

Numerous approaches to making change in healthcare systems are available, such as lean, six sigma, the model for improvement, healthcare delivery science, and implementation science. These are usually used in isolation, although there is some overlap between them, particularly between quality improvement and implementation science. When the aim is to build and disseminate knowledge about making change, collaboration across approaches might help create change more successfully and efficiently.

Interdisciplinary tensions

Over the past several years we have recognised a tension and, at times, a competition between quality improvement (QI) and implementation science (IS), two commonly used systematic approaches to improving the quality, safety, and value of healthcare services and to disseminating what is learnt from those efforts. This tension is unnecessary and wasteful when so many gaps in healthcare quality need to be addressed. QI focuses on highly relevant work within a particular context, while IS focuses on framing the work so that findings are generalisable. Both approaches are important, and there is considerable overlap from which each can learn. Failure to recognise the overlap can lead to unnecessary duplication of interventions, delay in the dissemination of effective interventions, and missed opportunities to work together to improve healthcare. Others have also recently noted the overlap between QI and IS, such as the potential to use both to improve cancer care.2

The response of healthcare systems to the covid-19 pandemic exemplifies this challenge. Early in the pandemic, local systems experienced a rapid influx of patients with covid-19 and dwindling supplies of personal protective equipment (PPE). Institutions relied on sound QI methods to determine how to solve the problem of PPE shortages in their particular context. While this was necessary and helpful, there was, perhaps, a missed opportunity: if IS methods had also been used to help solve those problems, it might have been easier to share the answers with others. Both QI and IS bring a rich base of knowledge and skills. Both are needed, but their potential summative effects have been underused during the covid-19 pandemic, partly because these fields see themselves as competitors rather than collaborators.

QI and IS approach change from different philosophical underpinnings, yet we feel they share similarities that suggest combining their lenses would be beneficial. While QI comes from systems operations3 and IS from the behavioural sciences,4 both recognise that changes occur in a specific context and are affected by that context, requiring that each context be considered unique. The outcomes of interest in QI are generally improvements in the quality of care: safety, timeliness, effectiveness, equity, efficiency, and patient centredness.5 The outcomes of interest in IS generally include the uptake and application of evidence based care, with attention to acceptability, cost, and feasibility.6 Both fields focus on disseminating findings to others through peer reviewed publications.7 8 Overall, this shared endeavour has been accurately described as the work of moving evidence into practice.9

Different approaches to change

Making and documenting changes to interventions is difficult, and reporting of QI and IS work varies, limiting impact.10 The Standards for Quality Improvement Reporting Excellence (SQUIRE) were developed to improve the reliability of reporting among those using QI, IS, or any of the other approaches to improvement.7

The challenge of reporting is most apparent in capturing how the intervention changes in response to the local context. An intervention has an initial structure, but it is typically modified throughout the process of improvement or implementation to make it more effective in a particular context. Both fields use imperfect approaches to capturing data about these modifications, and each has something to offer the other.

In QI, changes are widely promoted as being best accomplished through “tests of change” that predict, test, and assess the effect in the local microsystem. Careful use of tests of change, such as through plan-do-study-act (PDSA) cycles, is viewed as key to learning about the microsystem and the context to inform the change process.11 The strength of this approach is the importance placed on the microsystem and context, and in this way it may inform the IS field, which does not strongly emphasise the use and reporting of recursive change.

In IS, the concept of modifying the intervention is referred to as “adaptation” and is recognised as key to identifying how to spread effective interventions to new contexts, rather than to making the change work in one specific context. Making, assessing, and reporting these adaptations is viewed as an essential part of generating knowledge that can be readily shared with others.12 The strength of IS lies in its methods and approaches for assessing and reporting adaptations, and these could usefully be applied to QI, which does not emphasise spread to other contexts as strongly.7 8

PDSA versus adaptation

PDSA cycles are often used in QI for tests of change in a system. PDSA changes should be small, focused, and deliberate.11 13 The goal is to try an intervention in a microsystem to learn how the microsystem reacts. Sometimes the test of the intervention is successful and the system moves closer to its goal. At other times the PDSA change may not be successful, and the team learns how the microsystem absorbs or ignores the intervention. Making successive changes in a system is often messy and complex. Key to successful PDSA projects is collecting data that clearly align with the goal and analysing each PDSA cycle.11 Healthcare improvement teams may be frustrated when a step in a PDSA is unsuccessful, or they may worry that testing a “small” change will not lead to the improvement that they seek. But although the overall objective is improvement towards the goal, PDSA is about learning and gaining insight into the system, from both the successes and the failures.13

In contrast, IS develops an initial, detailed plan for the implementation process. IS considers contextual factors in the development of the initial implementation strategy to determine which may be facilitators of, and which barriers to, the intervention.14 15 Identifying these factors in advance allows the team to create implementation strategies that address the anticipated barriers; however, even the best designs and plans will be confronted with a changing context. IS often addresses this by adapting the intervention. In IS there may be tension between fidelity to the planned implementation strategy and adaptation. Adaptation, in some ways like PDSA cycles, plays a central role in the “fit” between the context and the intervention.

Fidelity in QI and IS

Understanding the fidelity of the intervention in both PDSA and adaptation is vital to the successful execution of change. Haphazard execution is a risk during iterations and limits learning.16 Fidelity in IS is the extent to which an intervention adheres to the planned protocol. The intervention may be adapted, but the core elements are intended to be implemented as initially designed. In QI, interventions are expected to be modified through each PDSA cycle as the team gains insight into what works, for whom, and in what context. In QI, fidelity refers both to adherence to the planned protocol within each PDSA cycle and to the faithful use of data to inform the next test of change.17 This ensures that changes are driven by the findings of the previous iteration and lead to the accumulation of insights about the intervention, increasing the possibility that the intervention will be sustained within the system.

Context and change

In complex systems the context changes in response to alterations in processes, people, policy, or any other perturbation. Context and healthcare improvement interventions interact, so it is important to account for how both the context and the intervention change over time in order to make improvements more sustainable. In QI, context is defined as the physical and sociocultural makeup of the local environment and the interpretation of these factors by the stakeholders in that environment.7 It is considered fluid, consisting not only of specific factors but also of individuals’ interpretation of the relationships between those factors.18

Context in QI may be identified and assessed through process analysis tools such as process flow, cause-effect, or workflow diagrams. These tools are used in the design of interventions to provide insights into the context, enabling the initial intervention to be tailored to the specific microsystem. The intervention is then modified through successive PDSA cycles, based on how the local context reacts to the intervention.

In IS, models such as the consolidated framework for implementation research (CFIR) provide a framework of contextual domains.19 These help inform the design of interventions and are especially useful for formative evaluations of change. CFIR domains, for example, are prespecified contextual factors and sub-constructs, including the adaptability and trialability of the intervention. CFIR also recognises PDSA cycles as one way to adapt the intervention to a specific context.19 Importantly, both QI and IS note that the interaction is bidirectional, with the intervention affecting the context and the context affecting the intervention.

Learning from one another

The different origins of healthcare QI (systems operations3) and IS (behavioural change4) sometimes obscure their common goal of improving the quality, safety, and value of healthcare services in partnership with patients, families, and communities. Each field has much to offer the other in the work of initiating and evaluating change. Figure 1 lists the main characteristics of intervention modification in QI and IS, showing the substantial overlap between the intent and execution of change in each field. While each has a specific approach, there is much in common and much that can be learnt through collaboration. QI can learn from IS by incorporating framework driven approaches to development, planning, and outcome evaluation, which help make the work more readily generalisable. IS can learn from QI by incorporating the data driven flexibility needed to show how interventions can succeed in a wide variety of contexts.

Fig 1

Characteristics and overlap of tests of change in quality improvement (QI) and implementation science (IS)

We see many opportunities for these fields to work together. Organisations that focus on QI should bring in IS strategies, specifically to learn what to measure to spread work to other microsystems and contexts. IS should expand the integration of QI within its work, perhaps by explicitly embedding PDSA cycles within IS sustainability efforts.

Funding organisations and journals have long recognised IS because of its similarity to the research model. Funders and healthcare journals should encourage the use of QI and IS methods together. This would expand the study of both and speed the growth of knowledge about how to make and sustain change in healthcare. Neither field alone has led to the healthcare system transformation that was hoped for, and emerging improvers and researchers in the health professions should be steeped in both. Perhaps this will lead to the emergence of a new set of knowledge and methods with a more lasting effect. By working together and combining knowledge and methods from both fields, QI and IS can develop a unified approach with more depth and effectiveness than either field has on its own.

Key messages

  • Implementation science and quality improvement both use tests of change to adapt interventions in a particular context

  • The context in which the changes are made affects the effectiveness of tests of change, irrespective of the methods that are used

  • Fidelity in tests of change refers both to the faithful use of data to drive the iterations of the intervention and to implementation of the intervention as designed

  • The common features of implementation science and quality improvement can be used to improve the conduct and reporting of changes to interventions

Footnotes

  • Contributors and sources: GO, MD, and LD have worked on developing methods and publishing guidelines for quality improvement. AJB has extensive knowledge and experience as a patient and working to improve care for patients, families, and communities. Her work focuses on facilitating the development of innovative delivery methods for older adults. DAC is internationally known for his leadership of implementation science.

  • Patient and public involvement: AJB represented the patients, families, and communities who are served by healthcare.

  • Competing interests: We have read and understood BMJ policy on declaration of interests and have no relevant interests to declare. The views expressed are those of the authors and do not represent the official position of the National Cancer Institute or the US Department of Veterans Affairs.

  • Provenance and peer review: Commissioned; externally peer reviewed.

  • This article is part of a series commissioned by The BMJ based on ideas generated by a joint editorial group with members from the Health Foundation and The BMJ, including a patient/carer. The BMJ retained full editorial control over external peer review, editing, and publication. Open access fees and The BMJ’s quality improvement editor post are funded by the Health Foundation.

This is an Open Access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/.

References