Editorials

Revalidation for doctors

BMJ 1998; 317 doi: https://doi.org/10.1136/bmj.317.7166.1094 (Published 24 October 1998) Cite this as: BMJ 1998;317:1094

Should reflect doctors' performance and continuing professional development

  John Parboosingh, director, professional development
  Royal College of Physicians and Surgeons of Canada, Ottawa, Canada K1S 5N8

    The present public demand for periodic revalidation of doctors is inevitable. The tradition of graduating from a training programme and obtaining a licence for life seems naive in this era when the quality of care we provide is so dependent on our efforts to keep up to date. It is considerations such as these that have prompted Britain's General Medical Council to open discussions with the Academy of Royal Colleges and other professional bodies on the concept of regular revalidation of doctors on the specialist and generalist registers.

    The objectives of periodic revalidation are to encourage doctors to respect changes in societal values, to integrate into their practices innovations shown to enhance patient care, and to give recognition to doctors who meet national standards of competence and performance. Delays in establishing such systems are understandable. In many countries regional shortages of specialists and primary care doctors will inevitably complicate the implementation of mandatory revalidation of doctors working in regions of greatest need. More importantly, the standards of competence and performance incorporated into a revalidation process must be sufficiently rigorous to distinguish reliably between those who should and those who should not be ministering to the sick.

    Twenty two of the 24 boards of the American Board of Medical Specialties issue time limited certificates for periods of seven to 10 years.1 Although the process differs for each specialty, in most cases recertification involves testing the doctor's knowledge and problem solving skills with multiple choice examinations. Knowledge testing, according to Weed,2 encourages the memorising of facts, a practice which should be discouraged, especially in the face of ever increasing quantities of new information. Instead, Weed recommends, doctors should be evaluated on their ability to find, integrate into practice, and communicate specialised information, a new skill set termed information literacy.3 High administrative costs and the demand for evidence of reliability and validity that will withstand threats of litigation have prevented the US boards from introducing methods of assessing clinical reasoning and communication skills.

    In contrast, postgraduate colleges in Australia and Canada have elected not to incorporate formal examinations into their recertification processes on the grounds that legally defensible examinations assess a limited range of competencies. Also, the initial certification process, taking into account cumulative evaluations over many years of training, incorporates more than a single examination. Instead, maintenance of certification is based on participation in educational and quality improvement activities. Traditional, provider centred continuing medical education that updates doctors' biomedical knowledge is replaced by learner centred activities that facilitate team learning and performance enhancement in multidisciplinary practice settings.4 The Royal Australasian College of Physicians has led the way in incorporating criteria that relate more closely to doctors' performance than attendance at traditional continuing medical education activities. Participation in quality improvement activities, such as practice audits, and the college's physician assessment programme, in which ratings from peers are sought on a range of professional and personal attributes in the practice setting, is essential for continuing certification.

    In this era of accountability and physician mobility, the idea of recording, in one comprehensive monitoring system, the undergraduate and postgraduate training experiences, specialty or generalist certification, and activities used by doctors to enhance their professional development is attractive. The recently initiated American Medical Accreditation Program (AMAP)5 helps doctors to avoid the repetitive task of providing professional data to multiple organisations. Although still at the developmental stage, AMAP recognises the need to move from the traditional, single event, “snapshot” assessment to continuous monitoring of competencies and performance over time.

    In Canada we have proposed a programme of continuous recertification, or revalidation, that relies on accumulated data from doctors' practices.1 In this system doctors will be required at regular intervals to submit summaries of selected patient encounters extracted from electronic records. The selected clinical conditions, reflecting local health problems, may change from one year to the next. Patient and peer assessment surveys will be used to assess interpersonal and doctor-patient communication skills. Records of individual doctors' activities geared towards practice improvement constitute the second component of the proposed revalidation system. Doctors will be required to use simulators to test themselves on a wide range of skills and competencies, selected on the basis of a practice profile derived from their database of patient encounters. Elwyn predicts that professional and practice development plans, a proposal still in its infancy,6 will call for the construction of learning portfolios for the whole practice team (doctors, nurses, and managerial staff).7 As well as providing documentation for periodic revalidation, electronic learning portfolios, already in use in the MOCOMP programme in Canada,8 will facilitate the link between continuing learning and performance enhancement. One advantage of the proposed system of revalidation is that focused educational support can be offered at an early stage to doctors who fail to achieve peer accepted standards of practice.

    Advances in computer technology should make the scheduling of periodic revalidation relatively simple. One option is to establish a five year schedule for specialists to provide their postgraduate colleges with computerised summary reports of practice experiences. Much of the scheduling could be automated, with specialists receiving automatic reminders about what information is needed and how to submit it.

    Periodic revalidation is likely to be introduced in most countries in the coming years, even before the systems have been shown to enhance patient care. The challenge is to find ways of monitoring the competencies expected of doctors in the next millennium while bearing in mind the wise advice offered by Cameron9: “Not everything that counts can be counted and not everything that can be counted counts.”
