Is evidence based medicine neglected by royal college examinations? A descriptive study of their syllabuses

BMJ 2000; 321 doi: (Published 09 September 2000) Cite this as: BMJ 2000;321:603
Wai-Ching Leung (W.C.Leung{at}), senior registrar in public health medicine (a); Paula Whitty, senior lecturer in medical care epidemiology (b)

(a) Epidemiology and Public Health, Newcastle City Health Trust, Newcastle upon Tyne NE4 6BE
(b) Department of Epidemiology and Public Health, University of Newcastle upon Tyne, Newcastle upon Tyne NE2 4HH

Correspondence to: W-C Leung

  • Accepted 4 May 2000

Although the value of evidence based medicine has been debated,1 the benefits of teaching it to undergraduates2 and postgraduates3 have been shown and have been acknowledged in the development of the new undergraduate medical curriculum.4 However, lack of adequate training is a major obstacle to postgraduates.3

Evidence based medicine has four distinct steps5: formulate clear clinical questions from a patient's problem; search the literature for relevant articles; evaluate the evidence for its validity and usefulness; apply useful findings in clinical practice.

This report examines the level of skills in evidence based medicine that are formally assessed by the royal colleges in the United Kingdom, through a review of the colleges' syllabuses for postgraduate examinations that are compulsory for specialist training.

Methods and results

We reviewed all syllabuses in effect on 20 October 1999 that were held by 16 faculties or royal colleges in the United Kingdom, representing 15 major specialties. Surgical examinations (held by three royal colleges) were reviewed separately. Subspecialties in surgery and pathology were grouped as a single specialty. Radiology and oncology, which are both examined by the Royal College of Radiologists, were analysed separately.

We reviewed each syllabus to determine whether the skills required for the four steps in evidence based medicine were assessed. The authors initially reviewed the syllabuses independently, and any disagreements (of which there were very few) were resolved by discussion. When it was clear from the syllabus that a specific section of the examination was dedicated to examining skills in evidence based medicine, we obtained sample or past papers where available.

These skills were not substantially assessed in seven of the 17 syllabuses, covering five of the 15 major specialties represented: general medicine, surgery, paediatrics, ophthalmology, and radiology. (We did not regard assessment of basic statistics alone as adequate for step three.) A dedicated section of the examination explicitly assesses these skills in five syllabuses (table). Most of these five syllabuses focus on candidates' ability to evaluate the evidence for validity, but they vary in the emphasis placed on evaluating the evidence for its usefulness, formulating a clear clinical question, searching for relevant literature, and implementing useful findings in clinical practice. In the remaining five syllabuses, in which skills in evidence based medicine are mentioned (see tables on BMJ website), there are no dedicated procedures for examining these skills in anaesthetics, obstetrics and gynaecology, and oncology; in pathology and occupational health, candidates have to submit a dissertation containing original research data.

[Table: Examinations with one or more dedicated sections for explicitly examining elements of skills in evidence based medicine]

Comment

One third of the specialties do not assess skills in evidence based medicine in their examination system. Examinations often exert a steering effect on the curriculum, and it would be difficult for future doctors to keep their professional knowledge and skills up to date unless these skills are learnt during training and are regularly applied in clinical practice.

Two thirds of the postgraduate examinations have no dedicated sections for the examination of skills in evidence based medicine. Even in syllabuses where these skills are examined, this is generally limited to skills in evaluating the evidence for validity. However, to practise evidence based medicine effectively, skills in formulating a clear question, evaluating evidence for its usefulness, and applying findings to clinical practice are equally important.2

We acknowledge that royal college examinations are only one element of specialist training and that revision of some syllabuses may be under consideration. Nevertheless, the strategies of the royal colleges for assessing skills in evidence based medicine should be reviewed, possibly by following good examples of the few that assess these skills in some depth.


Contributors: PW initiated the study and contributed to the study design, analysis, and writing of the paper. WCL contributed to the study design, collation of the syllabuses, analysis, and drafting of the paper. WCL is the guarantor for the study.


  • Funding None.

  • Competing interests None declared.

  • Tables showing examinations in which skills in evidence based medicine are and are not assessed are on the BMJ's website.

