
Opinion

Doctors should not be simply the ghost in the machine

BMJ 2024; 385 doi: https://doi.org/10.1136/bmj.q899 (Published 19 April 2024) Cite this as: BMJ 2024;385:q899
  1. Matt Morgan, consultant in intensive care medicine in Cardiff, editor of BMJ OnExamination (@dr_mattmorgan)
  2. Kieran Walsh, clinical director at BMJ

“Sometimes the questions are complicated and the answers are simple.”

Dr Seuss

Saying the word “finals” to any doctor fills them with a range of emotions, even though the old “big bang” exam at the end of training has largely disappeared. Now it seems the General Medical Council (GMC) wants to rationalise medical examinations even further.

In its recent statement on the future of medical education, the GMC says, “With up to date information at our fingertips via trusted sources on our smartphones we don’t need the huge repository in our heads from textbooks and lectures. Content in the curriculum can be streamlined.”1 But does the march of technology, including artificial intelligence (AI), really force us to simplify the medical curriculum and take things away? What if, instead, it allows us to add something more—something more important? Perhaps technology will help us to free up doctors’ time so they can concentrate on the more human aspects of medicine, such as the interpersonal skills and relationships that are so vital to patient care.

It has long been recognised that tests of knowledge are important but not a sufficient measure of the process of becoming a safe doctor. Of course, the role of a doctor is much richer than simply providing a diagnosis or a treatment plan. My main strength in working as an intensive care consultant is the ability to orchestrate a complex care team to focus on the patient in front of me. This, combined with the humanity to communicate effectively with patients, families, and staff, is what makes me (currently) more valuable as a clinician than ChatGPT.

And rather than use technology to “streamline” the medical curriculum, why not use it to improve the skills, insights—and most importantly the compassion, consideration, and understanding—that doctors will need in this brave new world?

BMJ has recently developed a new tool, in partnership with skilled doctors and SimConverse, to help doctors prepare for the exams that test these human parts of our role.2 Rather than AI simply replacing textbooks, we are using natural language processing to help doctors revise for the Practical Assessment of Clinical Examination Skills exam, an assessment that prioritises communication skills and patient welfare in a clinical setting. The new tool helps doctors to practise their communication and consultation skills by interacting with a range of simulated patients. While this demonstrates one way in which AI can help doctors improve their skills, we must acknowledge that even this tool can’t replace the experiential learning that comes from “hands on,” real life medicine, just yet.

In addition, current medical examinations focus on questions where an answer can provide certainty. Instead, we need to move to a process that assesses a human doctor’s ability to ask and answer questions when the answers are neither right nor wrong. The ability to ask these difficult questions, and to grapple with the uncertainty that medicine throws up, is more important than the ability to provide definitive answers. For example, every day I encounter issues when people confuse the ability to provide an invasive treatment with the more important question of whether it should be provided at all. Understanding the complexities of patient care, being able to critically appraise a situation, and helping patients to navigate their way through is something that technology cannot replicate.3

That said, perhaps doctors can use technology to improve on what they already do well. Medicine is all about understanding a patient’s story. We hope that this new tool will help clinicians to hone their communication and consultation skills further and discover new ways to find out about other people’s stories.

Doctors should not be simply the ghost in the machine. Ghosts are mostly invisible, scary, and made up. As we move into the age of the machine, doctors should be visible, safe, and as real as the patients who need their help.

Footnotes

  • Competing interests: MM and KW work for BMJ and helped to develop its new AI tool. KW is the clinical lead of the medical education and clinical decision support products at BMJ.

  • Provenance and peer review: not commissioned, not externally peer reviewed.

References