Reviewing the ARCP process: experiences of users in one English deanery
BMJ 2012;345:e4978 doi: https://doi.org/10.1136/bmj.e4978 (Published 03 August 2012)
- Andrew Eynon-Lewis, associate dean
- Martin Price, associate dean
- andrew.eynon-lewis@nhs.net
Abstract
Andrew Eynon-Lewis and Martin Price discuss the results of their survey of trainees and senior educators in the South West Peninsula Deanery
If you are a specialty trainee you will probably have had your progress in training assessed recently by your school’s annual review of competency progression (ARCP) panel. So how was it for you? What outcome would you give the ARCP process?
Remarkably for this high stakes review, we found only one study that examined the experiences of users of the ARCP process.1 This article considers what might be learnt from the findings of two surveys done in the South West Peninsula Deanery.
ARCP process
Specialty trainees in the United Kingdom are assessed through a competency based assessment approach that requires trainees to receive, at minimum, an annual summative assessment of their progress through training.2 Trainees are expected to submit evidence of their learning and progression—such as details of their workplace based assessments, a learning log, research, results of formal examinations, and information on any audits they may have done—in their portfolio, which is increasingly electronic. An ARCP panel, consisting of senior educators in the relevant specialty, awards a satisfactory or unsatisfactory outcome on the basis of this evidence.
This process allows the ARCP panel to judge a trainee’s progress remotely, with face to face meetings reserved for trainees likely to be awarded an unsatisfactory outcome. In many specialties, however, panels meet all trainees, with the meeting focusing on the submitted evidence rather than acting as an assessment in itself.
From December 2012, all doctors in training must also collect evidence to support their revalidation,3 which will be documented on an amended version of the “form R” that all trainees complete annually before their ARCP. This evidence, which will be cross referenced with that provided by employers of the trainee, will be available to the ARCP panel. It will contribute to the ARCP outcome awarded, with a trainee’s learning and reflection on any adverse evidence also contributing to the outcome.
Trainee survey
In September 2011 all specialty trainees in the South West Peninsula Deanery progressing through the ARCP route were invited to complete an online survey. The 202 respondents (response rate 20%) represented all schools and training programmes and were broadly representative of the trainees in the deanery and of the outcomes awarded by their respective school.
Trainees were not entirely happy with how the ARCP process was communicated. Many wanted clearer information about the requirements of the ARCP at the start of the training year, with some reporting that these requirements were “always changing.” (This may have been true, because some colleges finalised their curriculum and assessments during the period covered by the survey.)
Trainees were mostly happy with the way in which the evidence they had provided was assessed: 92% of respondents indicated that it was “fairly” or “very fairly” assessed in deciding the outcome.
Most trainees who had a face to face meeting with an ARCP panel valued it. A few said that the panel itself seemed unclear about which outcome should be awarded, some were uncertain about their outcome, and some were unclear how the assessment of their evidence related to the outcome awarded. Trainees also recognised some inconsistency in how panels and educational supervisors perform. Seventy eight per cent reported being awarded their outcome at the end of the panel meeting.
Most trainees had had an opportunity to discuss their progress outside the formal ARCP assessment process. Seventy per cent said that the panel encouraged them to provide feedback on their training and that discussion went beyond assessment to include other aspects of training, such as career guidance. Twenty respondents used the space for free text comments to state that they were “happy with the process.”
Many trainees asked for “better” feedback on their progress: they wanted to know how well they were doing, not just whether they were “satisfactory or not” (fig 1). Some thought the assessment process had become a tick box exercise, with judgments based on the number of workplace based assessments rather than their quality.
Fig 1 What trainees want from the ARCP process (free text comments)
Trainees’ comments on the ARCP panel meeting
“[My meeting] lasted 15 minutes; it did seem as though I had travelled 3 hours for what could have been a paper ARCP”
“I don’t think it necessary to waste time [looking] at reams of paperwork in a meeting. Outcomes should be assessed beforehand . . . [leaving] meeting time for more useful issues”
“I think this panel was very balanced and represented what the ARCP should be. Others have not been . . . the strength and validity of the process is based on panel members present”
“Panel(s) need to understand how to [provide] feedback; the importance of giving structured feedback seems to be lacking in a big way”
“[The meeting] could be less confrontational and criticism should be constructive. I am unclear if ARCP is a review of how things are going or an interview where you have to sell yourself and justify your existence”
Senior educators survey
In the online survey of senior educators (heads of schools and training programme directors) conducted in January 2012 we explored some of the issues raised in the trainee survey. The responses from 43 of the 57 senior educators invited to participate (75% response rate) ensured that all schools and most training programmes were represented.
Although respondents reported that processes were in place to inform trainees and educational supervisors of what is required for the ARCP, confidence in how well the requirements were understood by these groups was not high (figs 2 and 3). Senior educators reported that trainees were often underprepared for their first ARCP.
Fig 2 Senior educators’ confidence in whether trainers understand what is required of them from the ARCP process
Fig 3 Senior educators’ confidence that trainees understand what is required of them from the ARCP process
Generally, senior educators reported assessing “most” or “all” of the trainees’ evidence before the ARCP panel meeting, with few respondents saying that no evidence was read. Workplace based assessments and educational supervisors’ reports were the two most influential pieces of evidence in determining outcomes. For workplace based assessments, it was insufficient numbers rather than poor quality that led to unsatisfactory outcomes being awarded, because “minimum numbers are mandated in the curriculum.” Some respondents agreed with our trainees’ concerns, recognising the need to “move away from tick-boxes and back to people.” Seventeen respondents (40%) had concerns about the quality of educational supervisors’ reports, but most were not offering feedback on these reports.
Most educators valued the panel meeting with trainees, and most meetings were timetabled for 20-30 minutes, with some flexibility built in. A briefing of members before the panel and “externality,” the process of being visited by another deanery, were reported as important in securing quality and achieving consistency. Nevertheless, over a third of educators had concerns about consistency of panels. Most respondents gave trainees their outcome at the end of the meeting. Half of respondents reported that additional evidence presented by the trainee at the meeting influenced the outcome.
Respondents were uncertain about the value of the ARCP in identifying underperforming trainees and supporting excellent trainees. One said, “Trainees can get boxes ticked but still not be confident or competent doctors,” and another commented, “ARCP tends to bring people to the average—[it’s] difficult to shine.”
Learning from experience
From this limited study, we sense that users of the ARCP process are generally familiar with, and confident in, the process and can provide some meaningful critical evaluation of it. Senior educators recognise the need for further training in conducting ARCPs (it is a General Medical Council requirement that all supervisors are trained) and that this training needs to be extended to educational supervisors and clinical supervisors to improve the quality of their respective reports for the panels.
Trainees requested clear information on the ARCP process. Educators indicated that this information was provided but lacked confidence in how much of it trainees understood.
Trainees generally think that the process is fair and appreciate the potential value that a meeting with the panel can offer, but they call for feedback that is less formulaic and more individualised. Awarding the outcome at the start of the meeting, which would require all the trainee’s evidence to be submitted beforehand, would provide the opportunity for a formative dialogue to emerge, moving the meeting on from an “assessment of learning” to an “assessment for learning” and supporting the educational appraisal and annual planning elements of the process.
Given the increasing use of electronic portfolios and the opportunity this offers for evidence to be assessed remotely (the norm in schools of general practice), is it worthwhile for panels to meet all trainees when doing so disrupts service delivery? Would it be a better use of resources if schools that currently meet all their trainees adopted selective policies, targeting the trainees likely to gain most from meeting the panel? The indication from our senior educators is that the process may not be robust enough to identify this group. Other options are to give trainees who are awarded satisfactory outcomes the opportunity to request a panel meeting, or to have a randomly selected subgroup of trainees meet an ARCP panel.
Footnotes
Competing interests: AEL sits on ARCP panels for the South West Peninsula Deanery’s school of general practice and MP sits on ARCP panels for the school of anaesthetics. Both are responsible for assessment in the deanery.