
Research

Admissions processes for five year medical courses at English schools: review

BMJ 2006; 332 doi: https://doi.org/10.1136/bmj.38768.590174.55 (Published 27 April 2006) Cite this as: BMJ 2006;332:1005
  1. Jayne Parry, senior clinical lecturer1 (J.M.Parry.1@bham.ac.uk),
  2. Jonathan Mathers, deputy director2,
  3. Andrew Stevens, professor of public health2,
  4. Amanda Parsons, research assistant2,
  5. Richard Lilford, professor of clinical epidemiology1,
  6. Peter Spurgeon, professor of health services management3,
  7. Hywel Thomas, professor of economics of education4

  1 Department of Public Health and Epidemiology, University of Birmingham, Edgbaston, Birmingham B15 2TT
  2 Health Impact Assessment Research Unit, Department of Public Health and Epidemiology, University of Birmingham
  3 Health Services Management Centre, University of Birmingham
  4 School of Education, University of Birmingham

  Correspondence to: J Parry
  • Accepted 2 February 2006

Abstract

Objective To describe the current methods used by English medical schools to identify prospective medical students for admission to the five year degree course.

Design Review study including documentary analysis and interviews with admissions tutors.

Setting All schools (n = 22) participating in the national expansion of medical schools programme in England.

Results Though there is some commonality across schools with regard to the criteria used to select future students (academic ability coupled with a “well rounded” personality demonstrated by motivation for medicine, extracurricular interests, and experience of team working and leadership skills), the processes used vary substantially. Some schools do not interview; some shortlist for interview only on predicted academic performance, while those that shortlist on a wider range of non-academic criteria use various techniques and tools to do so. Some schools use information presented in the candidate's personal statement and referee's report, while others ignore this because of concerns over bias. A few schools seek additional information from supplementary questionnaires filled in by the candidates. Once students are shortlisted, interviews vary in terms of length, panel composition, structure, content, and scoring methods.

Conclusion The stated criteria for admission to medical school show commonality. Universities differ greatly, however, in how they apply these criteria and in the methods used to select students. Different approaches to admissions should be developed and tested.

Introduction

Medicine remains one of the more oversubscribed university courses. Failure to gain admission despite attaining the highest academic grades can lead to understandable resentment and perceptions of bias and unfairness in the system.1 Lumsden and colleagues suggested that admission to medical schools in the United Kingdom was based on procedures that were often “secretive and varied.”2 Others have suggested discriminatory practices exist.3 4

The recent independent review of admissions to higher education (the Schwartz report) emphasised the need for a fair and transparent admissions system, especially for subjects such as medicine where demand exceeds supply and where it is difficult for staff to select from a growing pool of academically highly qualified candidates.5 While emphasising its belief in the autonomy of institutions over admissions policies, the report recommended that all admissions systems should strive to use assessment methods that are “reliable and valid.”

Admissions tests used to select medical students in other countries

In the United States, requirements for admission to medical school vary from school to school and include minimum academic levels (indicated by undergraduate grade point averages), performance in the medical college admissions test (MCAT), and interview6 to identify one or more of a range of non-academic characteristics.7 A similar approach to selection is seen among the 17 Canadian medical schools.8 In both the US and Canada, medicine is read as a postgraduate degree. This is not the case in other countries—for example, in Australia, admission may be as a postgraduate or directly from high school. None the less, the methods adopted in selection again comprise a combination of minimum academic attainment, cognitive testing, and interview.9 In Europe, there is even greater heterogeneity—for instance, in the Netherlands, medical schools may select a proportion of entrants to their course via interview and other methods, but the remaining candidates are identified through a lottery among school leavers weighted for academic attainment.10
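The weighted lottery described for the Netherlands can be sketched as follows. This is an illustrative sketch only: the grade bands, weights, and applicant names are hypothetical assumptions, not the actual Dutch scheme.

```python
import random

# Illustrative sketch of a lottery weighted for academic attainment,
# loosely modelled on the Dutch approach described above. The grade
# bands and weights below are hypothetical, not the real Dutch values.
applicants = [
    {"name": "A", "grade_band": 1},  # highest attainment band
    {"name": "B", "grade_band": 2},
    {"name": "C", "grade_band": 3},
    {"name": "D", "grade_band": 2},
]

# Higher attainment earns more lottery "tickets" (hypothetical weights).
WEIGHTS = {1: 4, 2: 2, 3: 1}

def weighted_lottery(applicants, places, seed=None):
    """Draw `places` applicants without replacement, weighting by grade band."""
    rng = random.Random(seed)
    pool = list(applicants)
    selected = []
    while pool and len(selected) < places:
        w = [WEIGHTS[a["grade_band"]] for a in pool]
        pick = rng.choices(pool, weights=w, k=1)[0]
        selected.append(pick)
        pool.remove(pick)
    return selected

offers = weighted_lottery(applicants, places=2, seed=42)
```

Because draws are made without replacement, every applicant retains some chance of an offer, while higher attainment raises the odds, which is the essential property of a weighted lottery.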

The heterogeneity in selection processes exists both between and within countries. We examined diversity in England as a first step towards meeting the need for developing and testing different approaches to selection. Specifically, we identified the stated criteria for admissions, the tools with which such criteria are sought, and the processes by which those tools are applied.

Methods

Consent for participation

In 1997, the medical workforce standing advisory committee (MWSAC) published a report recommending an increase of 1000 places at medical school to meet the future demand for doctors within the NHS. The government accepted the recommendation and charged a joint implementation group (JIG), consisting of representatives of the Higher Education Funding Council for England and the Department of Health, with coordinating the allocation of these additional places.

The details of the study were initially presented at a meeting of the Council of Heads of Medical Schools, where verbal consent to proceed with the work was provided by all present. We then wrote to the heads of all the schools involved in the national expansion programme in England, setting out further details of the review and requesting formal consent for participation.

Once written consent was provided, the heads of schools were asked to nominate a member of staff as contact person—usually the admissions dean or other senior staff member responsible for overseeing the school's admissions process. We then contacted this person and explained the purposes of the review. All schools agreed to participate in the review.

Information collected in the review

We mapped out a generic “pathway” from receipt of the student's application form from UCAS (the Universities and Colleges Admissions Service) by the school through to making an offer to a candidate who had applied to join the five year medical course. We identified key points on this pathway and developed questions necessary to ascertain the nature of the process at these points. The questions were collated into a single proforma, which was used as the template for all subsequent data abstraction.

Information sources

Our first step was to collect and review all documents and information in the public domain relating to each school's selection process. These included prospectuses (both hard copies and electronic) and other documents made available through school and university websites. AP reviewed the documents and abstracted and entered all relevant information on to an electronic copy of the proforma.

Once each school's documents had been reviewed, we forwarded a copy of the proforma including the abstracted information to the school's contact person for checking. We arranged a telephone interview with the contact person to go through the proforma and in particular to fill in any gaps. All telephone meetings were conducted by the same person (AP) and arranged at a time convenient for the contact. The interview was audiotaped (with consent) to provide a contemporaneous record and a backup to the handwritten notes taken during the discussion. Information from the completed proforma was transferred to an electronic database. Where necessary, we contacted the interviewee by email or telephone, or both, for further clarification. All schools were provided with copies of their own proforma plus a copy of the review report (on which this paper is based) to provide an opportunity to check for factual errors and to comment on our observations.

Results

Twenty two of the 23 English schools participating in the expansion programme admit applicants to a five year “traditional” medical course. (The remaining institution, though not formally a separate medical school, admits students to study for phase I (two years) before they are integrated with students from another school for phase II (three years) of the programme; because of this distinct process we considered it separately.) Within this group are two schools that offer a non-foundation six year course, which includes a one year BSc/BA degree. We grouped schools into five categories depending on the steps taken to process the UCAS form and to decide whether or not to make an applicant an offer (table 1).

Table 1

Summary of processes undertaken by medical schools offering five year “traditional” course


Selection criteria

Academic criteria—Table 2 shows the academic criteria used by schools to identify applicants who would progress further in the assessment process. High A level grades (A or B in the exams taken at age 17-18) in two or more science subjects are a common requirement, but there is discordance with regard to the acceptability of A level re-sits (that is, exams passed at the required grade on a second or subsequent attempt).

Table 2

Academic criteria for admission to English medical schools


Non-academic criteria—With the exception of two, all schools considered some aspect of non-academic criteria when assessing the student's personal statement and the referee's report presented in the UCAS application form. The non-academic criteria identified by each school varied in terms of number and nature but there was some commonality in terms of evidence of motivation for, and a commitment to, medicine; team working, leadership, and the acceptance of responsibility; a range of extracurricular interests; and experience of working in health or social care settings.

Short listing candidates for interview

The number of people involved in assessing applicants' UCAS forms ranged from one to 30 across schools. In four schools this involved lay people—that is, people who were not members of the academic or clinical staff, such as a local headteacher. Only 11 schools offered training for people assessing the UCAS forms, and within this group the nature of training varied considerably (table 3). Among the schools that assessed non-academic criteria, 13 either double marked each form or used a second assessor if a form was rejected by the main assessor.

Table 3

Shortlisting candidates for interview at medical school


Method of assessing the UCAS form

Of the 20 schools that scored UCAS forms on academic and non-academic criteria, 11 have complex scoring systems based on the allocation of marks (often from 0-3 or 0-5) for a set of predefined criteria, with written guidance to assist scoring. At six other schools assessment is less complex and aims to divide applicants into those who should or should not be called for interview, and those who are borderline. Of the remaining schools, one assesses the UCAS form on a combination of GCSE (general certificate of secondary education, taken at 15-16 years) results, predicted A level grades, and healthcare experience with other non-academic criteria coming into play only for applicants who fall short on this initial scoring, while another has developed an online questionnaire that all applicants complete. This is scored electronically, and the results are coupled with an assessor's scoring of the referee's statement. The last school did not reveal details of how the form was assessed.
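The kind of structured scoring system described above—marks against predefined criteria, with totals deciding who is shortlisted—can be sketched as follows. The criterion names, the 0-5 scale, and the cut-offs are hypothetical examples, not those of any particular school.

```python
# Illustrative sketch of a structured shortlisting score: each UCAS form
# receives a mark (here 0-5) against predefined non-academic criteria,
# and the total places the candidate in one of three bands. All names,
# scales, and cut-offs here are hypothetical.
CRITERIA = ["motivation", "team_working", "extracurricular", "care_experience"]

def score_form(marks, interview_cutoff=14, reject_cutoff=8):
    """Sum per-criterion marks (0-5) and band the candidate."""
    assert set(marks) == set(CRITERIA), "every criterion must be marked"
    assert all(0 <= m <= 5 for m in marks.values()), "marks lie on a 0-5 scale"
    total = sum(marks.values())
    if total >= interview_cutoff:
        band = "interview"
    elif total < reject_cutoff:
        band = "reject"
    else:
        band = "borderline"  # e.g. referred to a second assessor
    return total, band

total, band = score_form(
    {"motivation": 5, "team_working": 4, "extracurricular": 3, "care_experience": 4}
)  # total 16, banded "interview" under these hypothetical cut-offs
```

The three-way banding mirrors the less complex systems described above, which aim only to divide applicants into those who should, should not, or might be called for interview.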

Two schools use the biomedical admissions test (BMAT, a two hour paper developed by the University of Cambridge Local Examinations Syndicate that tests critical thinking; http://www.bmat.org.uk/) to provide information additional to that presented on the UCAS form to select candidates, with one of these also requiring candidates to provide further background A level module scores. One other school has trialled the personal qualities assessment (PQA) tool but at present does not use it as part of the formal selection process. The PQA is an instrument designed to assess a range of personal qualities considered to be important for the study and practice of medicine, dentistry, and other health professions (http://www.pqa.net.au/). It comprises questions grouped into three sections: the first measures cognitive skills; the other two measure relevant personality and attitudinal traits.

The interview

Who is on the panel?—Only one school suggested that it would be prepared to hold one-on-one interviews, and then only if one of the designated interviewers for that day was unable to attend at the last minute—for example, as a result of illness. All other schools had a minimum of two or three examiners, with two schools explicitly preferring larger panels of four to five people. Interview panels invariably included one or more senior members of the academic or clinical staff. Ten schools included lay members, and six included senior medical students. All schools attempted to get a mix of men and women on the panel and, when possible, to have panellists from different ethnic backgrounds. As with the training of assessors of the UCAS forms, the training provided to interviewers varied between medical schools (table 4).

Table 4

The interview process at various medical schools


Interview format—Interviews were designed to draw out from candidates replies that would assist in assessing their performance against several prespecified categories (usually, but not always, correlating with the categories used to assess the UCAS form). Two schools had a standardised process where candidates were asked predetermined questions from a bank formulated by the school. All other schools, bar three, adopted a more mixed approach where questions were both predetermined and interviewer led, depending on the candidate's responses and statements on the UCAS form. At the remaining three schools questions were solely interviewer led but still directed to elicit information on pre-agreed criteria. Two schools have introduced variations to the simple question and answer approach within the interview. One has introduced questions to elicit information on communication skills, based on a video viewed by candidates while waiting for their interview; at the other, candidates are given a question to consider, again while waiting, which they are told they will be required to answer during the interview. Other schools are considering the introduction of problem solving tasks and group work as part of the interview process, though these components are not yet in place as part of the actual assessment.

Discussion

Our review suggests that there is commonality across schools with regard to the criteria used to select future students: academic ability coupled with a well rounded personality demonstrated by motivation for medicine, extracurricular interests, and experience of team working and leadership skills. Most schools operate a range of systems based around a two stage approach: shortlisting for interview—on the basis of predicted academic performance and the information presented in the UCAS form—followed by an interview. The implementation of this approach, however, varied substantially: some schools do not interview; some shortlist for interview only on predicted academic performance, while those that shortlist on a wider range of non-academic criteria use various techniques and tools. Some schools use information presented in the candidate's personal statement and referee's report while others ignore this because of concern over bias. A small number seek additional information from supplementary questionnaires filled in by candidates. Interviews vary in terms of length, panel composition, structure, content, and scoring methods.

What methods can be used to select students?

Previous academic performance has been shown both in the UK and the US to predict future academic performance, though correlations with clinical skills and postgraduate performance are less clear.11 12 In the UK, the use of GCSE grades and predicted A level grades as a marker of academic performance has long been the basis of medical school selection. Key studies providing evidence to support this, however, used data from cohorts in the 1970s, when students needed grade B in three science subjects.13 Widening acceptance of a non-science subject in combination with sciences makes it difficult to know how applicable these findings are to current and future students. Furthermore, the increasing numbers of students gaining three A grades at A level make differentiation between applicants problematic. Such difficulties have led some to argue for the introduction of intellectual aptitude tests,14 and in this review we identified schools using in-house questionnaires or external tests, or both, as part of the selection process. Others note, however, that because candidates for medical school are already highly preselected, cognitive tests are unlikely to discriminate much further between them, and instead conclude that “using a more finely developed marking system at the top end (A+ and A++, for example) has the greatest potential towards enabling enhanced selection by medical schools' admissions staff.”15

Consideration of non-academic characteristics, such as empathy, conscientiousness, and team working, has some face validity, but there is no absolute consensus on the characteristics medical schools should be seeking among future doctors—indeed, in a review of admissions processes in the US, Albanese et al noted that 87 different personal qualities relevant to the practice of medicine have been identified.7 Moreover, the literature offers little guidance on how best to assess these characteristics. We found that most schools used a combination of the candidate's personal statement, their referee's report, and an interview, though these were not applied in a standardised way across schools. Compelling evidence of the utility of the referee's report and personal statement is lacking,16 and while there is evidence that a structured interview format and the use of trained and experienced interviewers may improve the reliability and validity of the interview, controversy remains as to whether the costs of such a process are justified by the gains.17 18 This is relevant as schools in the survey reported difficulty in the training and recruitment of staff for interview purposes.

A way forward?

One option to reduce differences between schools in selection processes is the implementation of a centralised admissions system. Such a scenario is not that far fetched: as noted, the use of standardised cognitive tests is already in place at some schools in this survey. Moreover, two schools (Nottingham and St George's) conduct joint interviews for graduate students, and we understand there are discussions among northern medical schools about devising a shared bank of interview questions. Crucially, however, the present paucity of evidence on which to base selection cautions against the implementation of a single process based on present procedures. Rather, in the era of evidence based medicine, there is a case for developing a system through a process of experimentation and evaluation, the first stage of which is to clarify what type of student we want to select and why. In principle, three or four assessment processes could be established nationally, with applicants randomised to each and outcomes tracked over time. If a centralised approach is rejected because medical schools want to retain a local system allowing them to recruit a distinctive type of student, however, there is no less a need for them to assess more stringently the validity of their selection methods in identifying students who meet their local criteria.
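The national experiment proposed above—applicants randomised across a small number of assessment processes so that outcomes can later be compared—can be sketched as follows. The arm names and allocation scheme are hypothetical; the sketch only illustrates balanced random allocation.

```python
import random

# Minimal sketch of randomising applicants across assessment processes
# ("arms") of a national experiment, as proposed above. The arm names
# are hypothetical examples of processes that might be compared.
ARMS = ["academic_only", "structured_interview", "aptitude_test"]

def randomise(applicant_ids, arms=ARMS, seed=0):
    """Shuffle applicants and deal them round robin into arms of near-equal size."""
    rng = random.Random(seed)
    ids = list(applicant_ids)
    rng.shuffle(ids)
    allocation = {arm: [] for arm in arms}
    for i, applicant in enumerate(ids):
        allocation[arms[i % len(arms)]].append(applicant)
    return allocation

groups = randomise(range(100))
```

Dealing shuffled applicants round robin keeps arm sizes within one of each other, which simplifies later comparison of outcomes between processes; a fixed seed makes the allocation auditable and reproducible.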

What is already known on this topic

A recent review of admissions to UK higher education emphasised the need for a fair and transparent system

This is particularly necessary for entry to medical school, where demand exceeds supply and there is a growing pool of highly qualified candidates

What this study adds

There is no single process for selection at English medical schools and too little evidence to develop one

Developing a clear definition of suitability for medical training is the first priority, whether locally or nationally

Footnotes

  • We acknowledge the support and collaboration of the UK Council of Heads of Medical Schools, and the admissions staff at each school.

  • Contributors JP, JM, AS, HT, PS, and RL conceived and prepared the study protocol for the National Evaluation of Medical Schools project, of which this study (review of admissions processes) is a key component. JP, JM, and AP constructed the framework that informed the production of the data proforma. AP undertook the documentary analysis and interviews with admissions tutors, supervised by JM and JP. JP produced the first draft of the paper. All authors commented on subsequent drafts and have approved the final version. JP and JM are guarantors.

  • Funding Department of Health (Evaluation of the National Expansion of Medical Schools project; reference No 160056).

  • Competing interests None declared.

  • Ethical approval The West Midlands multicentre research ethics committee (reference No 04/MRE07/58).

  • Comment made by an admissions tutor The problem was excluding people from interview, not taking them for interview, because they were all so good. That was the problem, you felt heartbroken when you got down to the last 10 forms and you had only three places left. How on earth would you select them? It's terribly difficult, and quite unfair. People would joke that we should throw them all up in the air and invite the first 40 you pick up for interview—it would have been just as fair. Several of my colleagues wanted to introduce lotteries.

References

  1.
  2.
  3.
  4.
  5.
  6.
  7.
  8.
  9.
  10.
  11.
  12.
  13.
  14.
  15.
  16.
  17.
  18.