Revalidation for doctors
BMJ 1998; 317 doi: https://doi.org/10.1136/bmj.317.7166.1094 (Published 24 October 1998) Cite this as: BMJ 1998;317:1094
Dear Editor,
The international medical community is increasingly fractured by rules and regulations which prevent training and working outside one political bloc. I have specialist qualifications which are valid on both sides of the Atlantic. However, it is clear that there is no willingness to consider how individuals will revalidate their specialist qualifications in one continent while working in another. The Royal College of Paediatrics and Child Health, of which I am a member, sent out voluntary guidelines for validation of continuing education. These may presumably become compulsory. I wrote to the College asking whether I could use CME generated here and approved by the University of Colorado towards such validation. Several months later, I have still received no reply. When I was training in England and the USA in the 1980s, I remember how unhelpful the Royal College of Physicians was about accreditation of training for those working both in England and abroad.
I expect all the Royal Colleges to be unhelpful to those who wish to retain their specialist status in Europe while working abroad. I hope I am wrong.
It remains my belief that those of us who have trained and worked in more than one country often have something extra to offer the institutions in which we work. It would be a pity if, in the rush to validation, this issue were considered not worth addressing by the Royal Colleges.
Yours
Nicholas K. Foreman
Director Neuro-Oncology
The Children's Hospital
Denver, Colorado
USA
Competing interests: No competing interests
Editor
Despite the large amount that has been written in the BMJ in the last two years about continuing medical education (1), peer review (2), and revalidation (3), I have not seen any mention of the Dutch system of peer review (4), which has been running for five years.
Each specialist is required to 'belong' to a group of between four and eight like-minded specialists, and it is this group which is reviewed every five years. The specialists need not be working in the same hospital. Before the review, a questionnaire about facilities, workload, audits undertaken, etc, is completed by the group members. The visiting team consists of three doctors from the same specialty: the chairman, who is a member of the Board of the specialist society; someone who has been reviewed in the last three months; and someone who is to be reviewed in the next three months.
One of these reviews lasts six hours. As well as discussing the questionnaire and the audit material with the group, the review team meets representatives of local management, of the nursing staff, and of the local GPs. Before the final meeting with the group, the review team inspects the facilities and a sample of the case notes. The review team sends the group a draft report of the visit for comment. The final report can include a recommendation that a further visit should take place in two years. It is up to the group whether or not it shows the report to management. An appeal mechanism exists.
The advantages of the system are:
a) group review is much less threatening than individual peer review;
b) as the individuals are to be reviewed as a group, they are likely to meet together and be supportive of each other. This in itself has the potential to raise the quality of care. In addition, group members can also be mentors for each other in the preparation of Personal Development Plans;
c) everyone in the specialty is involved, as each group has the opportunity to visit two other centres every five years; the educational value of such visits is well established.
A visit rarely fails to identify a doctor whose performance is giving cause for concern. Even if the members of the group close ranks when meeting the review team, the discussions with management, with the nursing staff, and with the GPs identify the doctor who appears to be performing poorly.
1. Series on continuing medical education. BMJ 1998;316:301-4, 385-9, 466-8, 545-8, 621-4, 697-700, 721-2, 771-4.
2. Irvine D. The performance of doctors. I: Professionalism and self regulation in a changing world. BMJ 1997;314:1540-2.
3. Parboosingh J. Revalidation for doctors. BMJ 1998;317:1094-5.
4. van der Baan S. Peer review: experiences of the Dutch Ear-Nose-Throat Society. Proceedings of the March 1998 conference "CME - making sure it works". www.open.gov.uk/doh/meconf.htm, pp 62-4.
Competing interests: No competing interests
Re: Dutch system of peer review
Dear Editor:
The Royal College plans to launch the Maintenance of Certification Program across Canada on January 1, 2001. As we refine our administration and evaluation systems, we are keenly aware of the great value of similar experiences from other countries. National contexts and the societal forces propelling these programs are at the core of how each college approaches the issue of revalidation, recertification, or continuing professional development.
I would greatly appreciate hearing from your readers about best practices in relation to models of implementation and mechanisms of evaluation of similar programs, particularly programs that have been functioning for a few years and have the advantage of knowing what works and what doesn't.
More specifically: is anyone evaluating the behavioural/clinical outcomes and impacts of CME or continuing professional development activities? Has anyone examined the presumed correlation that CME/CPD produces higher-quality care and better trained professionals?
-FC
Competing interests: No competing interests