
Rapid response to:

Research

Electronic health record alerts for acute kidney injury: multicenter, randomized clinical trial

BMJ 2021; 372 doi: https://doi.org/10.1136/bmj.m4786 (Published 18 January 2021) Cite this as: BMJ 2021;372:m4786


Rapid Response:

The importance of context-specific clinical decision support design

Dear Editor:

In this patient-level randomized trial conducted at six hospitals, Wilson et al[1] report the results of a hard-stop alert that was delivered repeatedly to all clinicians on the patient’s care team each time they opened the patient’s chart. Alerts were accompanied by a static AKI care bundle and a requirement to add AKI to the patient’s problem list. The alert could be suppressed for 48 hours if the clinician acknowledged agreement or disagreement with the diagnosis of AKI. The authors found that the intervention drove “relatively small” changes in clinician behavior, “had no effect on the risk of progression of AKI, dialysis, or death”, and was “associated with a significantly higher risk of death at 14 days in the non-teaching hospitals”. Given these results, the intervention’s effectiveness may have been limited by: (1) a false assumption that clinicians were unaware of the patient’s AKI, (2) fatigue due to the alert’s interruptive, hard-stop nature, and (3) lack of synchrony between CDS delivery and decision-making ‘in the wild’, where the realities and competing demands of clinical care shape what clinicians can actually do.

The authors acknowledge that electronic alerts are implemented under “the assumption that increased recognition of AKI will improve care of these patients.” However, failure to follow evidence-based best practices can also stem from non-acceptance of the evidence, real or perceived lack of applicability to specific scenarios, limited ability to act on knowledge, or competing demands in the fast-paced clinical work environment.[2] CDS for any condition should be informed by a thorough investigation of the specific knowledge-to-practice pipeline, take into account the realities of the clinical work environment, and aim to provide support where potential risk points are identified.

Though the number of times the alert fired and/or was dismissed without action was not reported, delivery of the intervention likely generated a high volume of hard-stop alerts that could have distracted clinicians, repeatedly increasing their cognitive load while caring for patients in the intervention arm and potentially leading to unintended consequences, including medical errors unrelated to AKI.[3] Bisantz and Wears warn that hard-stop alerts should be implemented very carefully and used only when appropriate (i.e., to prevent immediately dangerous actions, or to support a very clear, evidence-based implementation after clinician buy-in has been secured through a participatory design approach), since “they are fundamentally blunt instruments, limiting behavior to a single course of action, one that has been predetermined by system designers a priori, without regard to the particular context, goals, or actions active at the time.”[4] Although the Supplement mentions that clinician focus groups elicited feedback on some of the intervention’s design elements, the intervention’s design suggests that these focus groups were not accompanied by an in-depth sociotechnical work system analysis that could adequately inform a solution grounded in inpatient clinicians’ needs, workflow, and clinical context.

Beyond integration within the EHR, a 2005 systematic review of 70 studies found three features of CDS systems to be predictive of improved clinical practice: provision of CDS as part of clinician workflow, provision of recommendations rather than assessments, and provision at the time and location of decision-making.[5] While Wilson et al should be lauded for providing evidence-based CDS for AKI, delivering a static care bundle within a pop-up alert that was temporally unrelated to decision-making or workflow may have limited its effectiveness and contributed to alert fatigue. Potential benefits of CDS for many patients may also have been lost through suppression of pop-up alerts by busy clinicians. Indeed, a 2020 meta-analysis of 122 CDS clinical trials reported that most CDS interventions only “achieve small to moderate improvements in targeted processes of care” (5.8%, 95% confidence interval 4.0% to 7.6%).[6] For CDS to meet its full potential to improve care, it must be designed to meet users’ needs, with a careful understanding of the work context, by delivering the right information to the right user in the right format at the right time in their workflow.[7] If it does, the future of CDS is bright.

References
1 Wilson FP, Martin M, Yamamoto Y, et al. Electronic health record alerts for acute kidney injury: multicenter, randomized clinical trial. BMJ 2021;372:m4786.
2 Glasziou P. The paths from research to improved health outcomes. Evidence-Based Nursing 2005;8:36-8. doi:10.1136/ebn.8.2.36
3 Powers EM, Shiffman RN, Melnick ER, et al. Efficacy and unintended consequences of hard-stop alerts in electronic health record systems: a systematic review. J Am Med Inform Assoc 2018;25:1556-66.
4 Bisantz AM, Wears RL. Forcing functions: the need for restraint. Ann Emerg Med 2009;53:477-9.
5 Kawamoto K, Houlihan CA, Balas EA, et al. Improving clinical practice using clinical decision support systems: a systematic review of trials to identify features critical to success. BMJ 2005;330:765.
6 Kwan JL, Lo L, Ferguson J, et al. Computerised clinical decision support systems and absolute improvements in care: meta-analysis of controlled clinical trials. BMJ 2020;370:m3216.
7 Campbell RJ. The five rights of clinical decision support: CDS tools helpful for meeting meaningful use. J AHIMA 2013;84:42-7 (web version updated February 2016).

Competing interests: No competing interests

22 February 2021
Edward R Melnick
associate professor
Ayse P. Gurses, professor, Schools of Medicine, Bloomberg Public Health and Whiting Engineering, director, Armstrong Institute Center for Health Care Human Factors, Johns Hopkins University; Scott Levin, associate professor, Department of Emergency Medicine, Johns Hopkins University; Jeremiah S. Hinson, assistant professor, Department of Emergency Medicine, Johns Hopkins University School of Medicine
Department of Emergency Medicine, Yale School of Medicine
464 Congress Ave, Suite 260, New Haven, CT 06519