Authors: Faris Hussain, junior doctor, Cardiff University School of Medicine, Wales and Phil Smith, consultant neurologist (and lead for written examinations), University Hospital of Wales, Cardiff.
Doctors in training are often asked to write single best answer (SBA) questions, which are used to assess medical students and trainees in coursework and in the final examinations of most undergraduate and postgraduate medical qualifications.
Most doctors in training can identify appropriate content upon which to base questions, but despite having previously answered hundreds of questions of this type, many find it hard to write high-quality versions themselves.
A good understanding of how SBA questions are developed and constructed also helps candidates to answer this type of question in exams – see Tips for answering single best answer exam questions for advice on preparing for these SBA examinations.
SBA questions need to be succinct, precise and have appropriate content to ensure reliable and valid outcomes from assessments, and to reward the best candidates when applying their clinical knowledge. Clinicians writing SBA questions should therefore:
Focus on common and important topics. SBA questions are not meant to catch out candidates; they should relate to topics that are common and important in the practice of the person being assessed. For example, for a medical undergraduate, common topics are those frequently encountered in the practice of an F1 doctor (first year of the foundation training programme), such as pre-operative assessment and administering intravenous fluids; whereas important topics are situations that are potentially life threatening or life changing, such as meningitis, suicidality and obstetric emergencies.
Follow a consistent style. All SBA question stems, lead-in questions and answer options should be written to a consistent style within the same assessment. This ensures the test solely assesses a candidate’s knowledge and application of that knowledge, rather than their ability to interpret language discrepancies or to cope with distractions, such as inconsistent ordering of clinical material.
Assess learning objectives. Question content should align with the curriculum and the learning objectives for that course. Tagging questions to learning outcomes allows indexing and helps to avoid topics being over-represented or absent from the question bank.
SBA question structure
A clear understanding of how SBA questions should be structured, and of the role of each component, is essential when writing this type of question. An SBA question comprises three components: the question stem (scenario); the lead-in question; and the answer options.
1. The question stem
For assessments of doctors and medical students, the stem is a clinical or science-based scenario of a situation that clinicians might encounter in practice, so should be written with the following in mind (1):
Reflect good and realistic clinical practice. The clinical scenario should reflect real life and exemplary care. Although poor clinical practice is common, SBA questions should generally reflect good practice in order to ensure candidates learn best practice and apply this in real-life settings.
Ensure all options are plausible. The stem should contain sufficient information to make all five answer options plausible, and not include “decorative” information unrelated to one of the answer options (even though such additional information will be present in real life). This minimises distraction and avoids assessing anything other than application of clinical knowledge.
Set the question length. Each question (stem, lead-in and answer options) should take no longer than 45 seconds to read and answer, since candidates usually have only 60–90 seconds for each question. Providing detailed investigation results and/or a picture helps to reduce words in the stem.
Consistent structure. SBA questions must reflect assessment in practice:
The order of information should follow a standard format. First (for all clinical SBA questions), the age and sex of the patient, followed by the presenting complaint, then (depending on their relevance) the past, family and social history, and finally examination and investigations.
The presenting complaint is the reason for the clinical interaction, so should appear in the opening line.
Consistent normal values must be applied across all relevant questions, and these ranges should be up to date and match what is available and used in real clinical practice.
Consistent writing style. SBA questions within an assessment should follow the same style:
Economy, precision and clarity of language. The stem should be written in plain English so the information is clear and precise.
Consistent terminology. The same terms should be used across questions, for example description of a patient’s sex (for example, “woman,” rather than “lady” or “female”); the use of consistent terminology also applies to medical terms and their spelling.
Present tense. Relevance and urgency are often more concisely conveyed in the present tense.
Active voice. Questions in which the subject performs an action are usually easier to understand, less formal and shorter than those written in the passive voice (where something is done to the subject). For example, "the doctor administers x medication" is active, whereas "the patient is given x medication" is passive.
Raw data over formulation. Actual data or pictures reflect clinical practice better than their verbal formulation or interpretation. For example, “her temperature is 38.5°C and pulse rate is 121 bpm” requires more clinical knowledge and experience than “she is pyrexic and tachycardic.”
2. The lead-in question
The lead-in question asks the candidate to solve the clinical or science-based problem outlined in the question stem, using the information that was presented.
A good lead-in question has the following qualities:
Is answerable without seeing the answer options (cover test). To answer a question using the cover test, cover the five options and attempt to answer the SBA question using only the stem and lead-in question. The question should be answerable solely from the stem, without needing the prompt of possible options. The very best SBA questions will easily convert to “very short answer” questions, with no answer options at all. (2)
Is answerable only with the stem. The lead-in question should be answerable only alongside the stem. If the stem (including any picture or data) were concealed, the lead-in question should not be answerable. For example, “What is its most likely adverse effect?” requires reading of the stem to work out the correct answer. Conversely, “What is the most common adverse effect of ramipril?” is a freestanding question that makes the stem redundant.
Expresses uncertainty to reflect clinical practice. Asking for the “most likely” diagnosis or the “most appropriate” investigation implies that all the options are plausible but that one is the “most correct.” This formulation will more likely test application of knowledge rather than knowledge recall.
Is posed as a question (rather than a statement) and so ends with a question mark.
Is a positive question. There is little value in being able to identify the “least likely” answer when testing application of clinical knowledge. A negative question also fails the cover test, as the number of “least likely” answers is almost infinite.
3. The answer options
To reduce the risk of candidates being drawn to a particular answer, it is important to present the answer options to an SBA question in alphabetical or numerical order, to write each in a similar format, and to ensure that all appear plausible. The options presented should:
Include the truly best answer. To pass the cover test, the option list must include the very best answer, selected according to clinical practice guidelines and expertise. Omitting the very best answer obliges the candidate to look first at the answer options, which is misleading and makes the question fail the cover test.
Have an agreed “most correct” answer. The “most correct” answer (usually of five) may be 80% correct, with the remaining distractors perhaps 20–30% correct. These conditional probabilities compare well to a doctor’s real clinical practice when considering decisions involving diagnosis and management.
Belong to the same domain/address the same topic. The answer options should all be from the same topic or domain. This makes discrimination more challenging for the candidate, and avoids the risk of a candidate having to choose between two equally good answers from different domains (eg, an investigation and a treatment).
Look similar. Answer options should be of similar length (preferably short) and each have a similar number of components (usually one).
Be written with economy and precision. Shorter answer options are preferable. The same word(s) or phrase appearing in each option might be moved to the lead-in to shorten the options.
Before SBA questions can be included within a test or examination, it is essential to ensure they are appropriate for the target group of candidates. This can be achieved as follows:
Peer review questions in supportive groups. Discussing questions with a group of peers with varied experiences and specialities improves the quality of questions by allowing critique and development. Multiple pairs of eyes screening questions allow them to be checked for accuracy, formatting and style.
Provide explanations and references for questions. Each question should have an accompanying explanation that is well supported by academic references. This ensures that at every stage, including post-test review, the best option is well justified and the incorrect answers are rationalised. Depending on the context in which the question is used, this explanation may or may not be provided to the candidate as feedback.
A good structure for writing explanations includes:
Justification for correct answer: linked to clinical scenario.
Justification for each incorrect answer: explained in alphabetical or numerical order (ie, the order of the distractors excluding the correct option).
Academic references: to support the correct answer and the incorrect answers.
NICE (or other recognised body) guideline reference.
Using references ensures that questions are clinically accurate and related to up-to-date guidance while also guiding further reading.
Students are often provided with an "overview" summary of each question that gives its main clinical point without disclosing the question details (eg, "Management of acute coronary syndrome" or "Diagnosis of chest pain"). Linking questions to specific curriculum learning outcomes and sharing these outcomes with students is also useful.
Avoid rewarding “test-wise” candidates
Candidates who have taken lots of SBA tests might be able to apply particular strategies to answer SBA questions without using or needing clinical knowledge. Knowing that these strategies exist, and how they work, can help clinicians to write better questions that ensure candidates must apply appropriate knowledge to answer them correctly. To prevent candidates from "gaming" the system, clinicians writing questions should:
List answer options in alphabetical or numerical order, thereby avoiding any advantage in anticipating a specific order.
Avoid convergence strategy prompts. Convergence strategy exploits the way that distractors might inadvertently point to the correct option. Here are three examples:
In an ordered list of numbers, the middle option is most likely to be correct, as the author has tried to conceal it by using higher and lower numbers as distractors. Authors should therefore ensure that among a list of numbers, the middle option is correct only 20% of the time.
If each answer option has two components, students may gamble that the correct two components will be those that appear most frequently among all the options. For example, a question with the following five options – 1. A+B, 2. B+C, 3. B+D, 4. B+E, 5. C+E – might prompt a candidate to gamble on answer 2, because B and C appear most frequently across all the options. Authors should therefore limit the number of times the correct components appear among the distractors.
If two answer options appear to be opposites, then one of them is probably correct. For example, having answer options “CT scan with contrast” and “CT scan without contrast” invites a 50/50 guess. Authors should therefore avoid the correct answer being one of an opposite pair or avoid using opposite pairs in options completely.
Ensure all distractors are plausible. Options that appear completely implausible or ridiculous reduce the number of options, making the question easier.
Avoid the correct answer looking different. Test-wise candidates know that the longest or most detailed option is likely to be correct. Authors should try to ensure answer options are of similar length.
Avoid options that use extremes in language. Test-wise candidates know that words like "always", "never", etc, rarely apply in medicine, and so answers containing these words are incorrect. "None of the above" and "All of the above" are poor options – not only do they fail the cover test, but "All of the above" will likely alphabetise to the top of the list.
Avoid cueing the answer. An unusual word appearing in both the lead-in question and in an answer points to that answer being correct.
1. Smith PE, Mucklow JC. Writing clinical scenarios for clinical science questions. Clinical Medicine 2016; 16(2): 142.
2. Bala L, Westacott RJ, Brown C, Sam AH. Twelve tips for introducing very short answer questions (VSAQs) into your medical curriculum. Medical Teacher 2023; 45(4): 360-7.