Papers

# Randomised trial of educational visits to enhance use of systematic reviews in 25 obstetric units

BMJ 1998; 317 (Published 17 October 1998) Cite this as: BMJ 1998;317:1041
1. Jeremy C Wyatt (jeremy.wyatt{at}ucl.ac.uk), senior research fellowa,
2. Sarah Paterson-Brown, consultant obstetricianb,
3. Richard Johanson, senior lecturer in obstetricsc,
6. Nicholas M Fisk, professorb
1. aImperial Cancer Research Fund Medical Statistics Group, Centre for Statistics in Medicine, Institute of Health Sciences, Headington, Oxford OX3 7LF
2. bInstitute for Obstetrics and Gynaecology, Imperial College School of Medicine, Queen Charlotte's and Chelsea Hospital, London W6 0XG
3. cAcademic Department of Obstetrics and Gynaecology, City General Hospital, Stoke on Trent ST4 6QG
1. Correspondence to: Dr J C Wyatt, Health Knowledge Management Programme, School of Public Policy, University College London, London WC1E 7HN
• Accepted 10 July 1998

## Abstract

Objective To evaluate the effectiveness of an educational visit to help obstetricians and midwives select and use evidence from a Cochrane database containing 600 systematic reviews.

Design Randomised single blind controlled trial with obstetric units allocated to an educational visit or control group.

Setting 25 of the 26 district general obstetric units in two former NHS regions.

Subjects The senior obstetrician and midwife from each intervention unit participated in educational visits. Clinical practices of all staff were assessed in 4508 pregnancies.

Intervention Single informal educational visit by a respected obstetrician including discussion of evidence based obstetrics, guidance on implementation, and donation of Cochrane database and other materials.

Main outcome measures Rates of perineal suturing with polyglycolic acid, ventouse delivery, prophylactic antibiotics in caesarean section, and steroids in preterm delivery, before and 9 months after visits, and concordance of guidelines with review evidence for same marker practices before and after visits.

## Discussion

Our data show encouraging trends in the number of obstetric units practising according to the evidence contained in the Cochrane module on pregnancy and childbirth. During the study period ventouse usage increased significantly more in intervention units than in control units, whereas use of steroids for preterm delivery did not change in either control or intervention units. Increases in use of polyglycolic acid sutures and use of antibiotics in caesarean section over the study period were similar in control units and intervention units, with no difference attributable to the visit. Educational visits were therefore associated with a significantly higher uptake of Cochrane review evidence for only one of the four clinical practices studied.

Taking a conservative view, the educational visit led to only a modest difference between control units and intervention units in the extent to which clinical practice was based on evidence, and no change in the extent to which practice guidelines were based on evidence. This is similar to the results of a more intensive social marketing intervention delivered to Canadian midwives by Hodnett and colleagues, though these investigators attributed their failure to the exclusion of obstetricians.24

### Internal and external validity

Our study was a rigorous randomised trial of educational visits as a method for helping obstetricians and midwives identify and implement evidence from the Cochrane module on pregnancy and childbirth in their units. We recruited, randomised, and followed up 25 of the 26 district general obstetric units in two former NHS regions. Four representative, common clinical practices linked closely to patient outcomes were assessed. The assessor was blind to unit allocation. A Hawthorne effect is unlikely as clinical practice data were obtained from notes of patients who gave birth at least 1 month before data collection, and we informed obstetric units that we were conducting a regionally coordinated audit only a fortnight beforehand. While we did not directly measure patient outcomes, we chose our four marker practices because they have been shown in systematic reviews to improve major outcomes, so were valid surrogates. 10 29

### Study limitations

There are several possible explanations for our failure to show much effect of the educational visits. It is possible that longer or repeated visits might work better, but these would be harder to deploy on a national scale. Single visits have worked well in the past for influencing individual clinical practices, and usually do so quite rapidly.21 Contamination between intervention units and control units is unlikely as randomisation was by hospital, rather than by patient or professional. We did not ask obstetric unit staff to nominate colleagues most influential to their education, as has been done in other studies,20 but instead targeted the obstetrician most involved with the labour ward, and the midwifery manager. These individuals may not be as academically influential as university based opinion leaders,20 but they are constantly on hand, and they are in a stronger position to identify and remove barriers to evidence based practice in their own unit than outsiders.

At baseline in 1994, two of the four marker practices were already being carried out according to Cochrane review evidence in more than half of the deliveries studied (table). This shows encouraging progress between 1990-2 8 and 1994, and achieving further increases may have been difficult because of a ceiling effect.30 The wide variation in clinical practices between units and the marked chance differences in baseline rates for use of polyglycolic acid sutures and ventouse between control units and intervention units made follow up rates hard to interpret (table).

In 1993 we found that the Cochrane module on pregnancy and childbirth was available in only 16% of UK district hospital obstetric units,14 but by 1994 it was available in six of our 13 (46%) control units and by 1995 in 10 (77%) of them, suggesting that passive diffusion of the concept of evidence based medicine was taking place.31 During the study period there were several regional and national initiatives to enhance the uptake of clinical evidence, such as the Royal College of Obstetricians and Gynaecologists' Minimum Standards of Care in Labour, which may have facilitated this.32 It is possible that our data collection at follow up occurred too early in the process of innovation31 to observe changes in clinical practice, since other changes may need to precede this, such as enhanced awareness of the role of evidence and changed unit policy. However, we did not find that written unit guidelines were more evidence based in intervention units than in control units.

### Study implications

Important lessons from our study for others conducting such implementation research are, firstly, that it is hard to improve on baseline rates for clinical practices of 60% or 80%. By analogy with clinical practice, where specific treatment is given only to diseased patients, support for implementation should be focused where it is most needed. Secondly, the heterogeneity of clinical practice (rates for all practices varied between units from 0 to 100%) and the passive diffusion of innovation in control units during the study period must not be underestimated. Finally, data we shall report elsewhere show very large mismatches between the policies claimed by unit staff, their written guidelines, and actual clinical practice, emphasising the need to measure actual clinical practice rather than rely on clinicians' statements or written guidelines.33

Educational visits are clearly one effective way of implementing change in clinical practice, based mainly on studies showing improved prescribing—for example, of antibiotics and other drugs. 21 22 34 Our study is the first randomised trial examining how to enhance the uptake of Cochrane review evidence using educational visits. Our failure to show much effect on obstetric practice does not contradict other studies of educational visits, as different mechanisms may be active when trying to change the basis on which a medical specialty rests. However, our study shows that educational visits, even to senior staff in two disciplines, are insufficient to convert an obstetric unit to evidence based practice, and that other techniques will be needed to enhance the impact of Cochrane reviews. It would be unfortunate if health services in the United Kingdom and elsewhere invested in educational visits to spread the ideas of evidence based medicine without further rigorous evaluation.

## Acknowledgments

We thank Iain Chalmers, Mark Starr, and David Spiegelhalter for support, Liz Byford and Val Dineen for data collection, and all clinicians who collaborated.

Contributors: JCW originated the study, assisted in obtaining funding, designed the study, analysed the raw data, and drafted and revised the paper; he will act as guarantor for the paper. SP-B and NMF assisted in obtaining funding, refined the study design and paper drafts, assessed guideline quality, and supervised data collection. RJ produced the video, carried out the visits, and contributed to study design and paper drafts. DGA and MJB advised on study design, carried out statistical analyses, and contributed to paper drafts.

## Footnotes

• Funding This study was funded by regional research implementation initiatives of the North Thames and South Thames regional health authorities; the Imperial Cancer Research Fund; and North Staffordshire Hospital Trust.

• Conflict of interest None.