Original Article
An evidence-based practice guideline for the peer review of electronic search strategies

https://doi.org/10.1016/j.jclinepi.2008.10.012

Abstract

Objective

Complex and highly sensitive electronic literature search strategies are required for systematic reviews; however, no guidelines exist for their peer review. Poor searches may fail to identify existing evidence because of inadequate recall (sensitivity) or increase the resource requirements of reviews as a result of inadequate precision. Our objective was to create an annotated checklist for electronic search strategy peer review.

Study Design

A systematic review of the library and information retrieval literature for important elements in electronic search strategies was conducted, along with a survey of individuals experienced in systematic review searching.

Results

Six elements with a strong consensus as to their importance in peer review were accurate translation of the research question into search concepts, correct choice of Boolean operators, correct use of line numbers, adequate translation of the search strategy for each database, inclusion of relevant subject headings, and absence of spelling errors. Seven additional elements had partial support and are included in this guideline.

Conclusion

This evidence-based guideline facilitates the improvement of search quality through peer review and, thereby, the quality of systematic reviews. It is relevant for librarians/information specialists, journal editors, developers of knowledge translation tools, research organizations, and funding bodies.

Introduction

The quality of systematic reviews and health technology assessments depends on many factors. A key factor is the evidence base, that is, the literature and other information on which the analysis is based. A sound evidence base is equally important in related contexts, such as clinical practice guidelines, formal decision analysis, and most health economics studies, as well as in the knowledge translation efforts surrounding this work. Performing a high-quality electronic search of information resources is essential to ensuring the accuracy and completeness of the evidence base used in these reports. Furthermore, search quality has resource implications for the conduct of a review, because the number of records retrieved and screened is often very large. The aim of this research was to develop an evidence-based guideline, in the form of an annotated checklist, for peer review of electronic search strategies. Given the inconsistent performance of the peer-review process in the contexts of journal publication [1] and grant funding [2], any new development in peer review should be evidence-based.

Checklists, scales, and instruments (henceforth, instruments) for validating aspects of search reporting in the systematic review process have been developed, and some address aspects of the overall search plan. We have reviewed these elsewhere [3]. However, no validated process exists for evaluating the quality and completeness of the electronic search strategies used to identify the largest portion of the evidence base for systematic reviews and related evidence-based products, such as health technology assessments and clinical practice guidelines. The lack of such a process, paired with a demonstrable level of error in reported searches [4], leaves this type of research open to debate over the quality of the evidence on which it is based.

Given the absence of any adequate instrument for peer-reviewing the electronic search strategies used for systematic reviews, a context in which high recall must be balanced against reasonable precision, we sought evidence regarding various “elements” of the electronic search strategy. We define elements as components of the electronic search strategy that could have a positive or negative impact on the performance metrics of recall and precision. Research evidence on how problems with these elements affect search performance would be the most compelling evidence. Because we expected gaps in that evidence, we also considered reports from the literature on the prevalence of errors of each type, as well as theoretical explanations of their impact. We sought this evidence through a systematic review of the literature. Finally, we sought expert opinion through a survey.
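
Stated formally (these are the standard information-retrieval definitions, added here for reference rather than quoted from the article):

    \[
      \mathrm{recall} = \frac{|\text{relevant} \cap \text{retrieved}|}{|\text{relevant}|},
      \qquad
      \mathrm{precision} = \frac{|\text{relevant} \cap \text{retrieved}|}{|\text{retrieved}|}
    \]

An element that causes relevant records to be missed lowers recall; one that admits large numbers of irrelevant records lowers precision and inflates the screening workload.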


Objectives

We had three objectives: to identify elements associated with the accuracy and completeness of the systematic review evidence base identified by electronic search strategies; to identify the impact of problems with any of these elements on the resulting evidence base; and to develop an evidence-based guideline for peer review of the electronic search strategy.

Systematic review methods

A systematic review of the literature was conducted to address (1) whether instruments exist that evaluate or validate the quality of literature searches in any discipline and (2) which elements of electronic search strategies have a demonstrable impact on search performance. For the second question, eligible articles were required to report performance indicators or measures (such as recall or relevance).
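
For illustration, such measures can be computed against a gold-standard set of known relevant records. The following is a minimal sketch using the standard definitions (the record identifiers and function name are hypothetical, not drawn from the article):

    # Recall and precision of a search against a gold-standard set of
    # known relevant records (e.g., PMIDs from a reference collection).
    def recall_precision(retrieved: set[str], relevant: set[str]) -> tuple[float, float]:
        hits = retrieved & relevant                  # relevant records the search found
        recall = len(hits) / len(relevant) if relevant else 0.0
        precision = len(hits) / len(retrieved) if retrieved else 0.0
        return recall, precision

    # Example: 3 of 4 known relevant records found among 203 retrieved.
    retrieved = {f"pmid{i}" for i in range(200)} | {"A", "B", "C"}
    relevant = {"A", "B", "C", "D"}
    r, p = recall_precision(retrieved, relevant)     # recall 0.75, precision ~0.015

A search tuned only for recall drives precision toward zero, which is why the guideline treats the two in balance.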

The search strategy was developed initially in the Library and Information Science Abstracts (LISA) database by one …

Systematic review

Our electronic database search strategies identified 9,155 records. After broad screening of the titles, abstracts, and keywords, 256 appeared potentially eligible and were sought for data abstraction. Of these, 113 articles were assessed as eligible for some aspect of the systematic review process (Fig. 1). The remaining articles were excluded either because they did not meet the inclusion criteria (n = 139) or because they could not be obtained (n = 4).

Existing scales and instruments

Twenty-six articles discussed aspects that …

Conceptualization

Recommendation

  1. Assess whether the research question has been correctly translated into search concepts.

Guidance

  1. Review the conceptual elements of the search before considering any of the mechanical elements.

  2. Consider that combining too many search concepts reduces recall (see the illustrative sketch at the end of this section).

The search strategy will ideally be submitted for peer review accompanied by the research question structured according to a format, such as Population, Intervention, Comparison, Outcome (PICO) [6], and an explanation of how the reference …
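
To make this concrete, the following is a minimal, hypothetical Python sketch (the concept terms are illustrative, not taken from the guideline) of how PICO concepts map onto numbered search lines: terms are ORed within a concept, and concepts are ANDed together.

    # Hypothetical sketch: compose Ovid-style numbered search lines from
    # PICO concepts. ORing within a concept broadens it; each additional
    # ANDed concept can only narrow retrieval, reducing recall.
    concepts = {
        "Population":   ["exp Hypertension/", "hypertens*.ti,ab."],
        "Intervention": ["exp Diuretics/", "diuretic*.ti,ab."],
    }

    lines: list[str] = []
    concept_lines: list[int] = []
    for terms in concepts.values():
        start = len(lines) + 1
        lines.extend(terms)                      # one search line per term
        lines.append(" or ".join(str(n) for n in range(start, len(lines) + 1)))
        concept_lines.append(len(lines))         # line combining this concept
    lines.append(" and ".join(str(n) for n in concept_lines))

    for i, line in enumerate(lines, 1):
        print(f"{i}. {line}")                    # e.g., "3. 1 or 2", "7. 3 and 6"

The final printed line ANDs the two concepts; adding a third or fourth ANDed concept would restrict retrieval further, which is why guidance item 2 warns against combining too many concepts.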

Discussion

Evidence-based peer review of electronic search strategies needs to focus on those elements that will negatively impact search performance. A preliminary list of elements was explored through a systematic review of the literature. Several additional elements, with either some supporting research evidence or support from the survey, were identified, and some initial elements were refined. After seeking opinion from the community of interest, the list was reduced to those elements with at least …

Conclusions

The development of this peer review guideline fills a gap in the quality assurance of search methodology in systematic reviews and health technology assessment reports. Errors in electronic search strategies have been shown to reduce the effectiveness of the searches used in these reports. This evidence-based guideline is based on a rigorous methodology, and it focuses only on those elements of the electronic search where …

Acknowledgments

We wish to acknowledge and thank Janet Joyce, who served as an advisor to this project, Raymond Daniel for technical assistance, Tamara Rader for writing assistance, Nick Barrowman and Isabelle Gaboury for statistical assistance, and all of those who participated anonymously in the survey. This research was supported by a capacity building grant from the Canadian Agency for Drugs and Technologies in Health. David Moher is supported by a University of Ottawa Research Chair.

References (10)

  • M. Sampson et al. No consensus exists on search reporting methods for systematic reviews. J Clin Epidemiol (2008)
  • T. Jefferson et al. Editorial peer review for improving the quality of reports of biomedical studies. MR000016. Cochrane Database Syst Rev (2007)
  • V. Demicheli et al. Peer review for improving the quality of grant applications. MR000003. Cochrane Database Syst Rev (2007)
  • M. Sampson et al. Errors in search strategies were identified by type and frequency. J Clin Epidemiol (2006)
  • M. Sampson et al. PRESS: peer review of electronic search strategies. CADTH Technical Report (2008)
There are more references available in the full text version of this article.
