Patient commentary: Stop hyping artificial intelligence—patients will always need human doctors. BMJ 2018;363:k4669. doi: https://doi.org/10.1136/bmj.k4669 (Published 07 November 2018)
I am a junior doctor working in a fast-moving, high-pressure environment to treat patients with both accuracy and compassion, and an academic researching human error and the impact of stress on performance.
The ability of a human to act as an effective doctor is limited by a multitude of factors highlighted in ‘Could artificial intelligence make doctors obsolete?’: time pressure in a busy workplace, training that is necessarily limited relative to the immeasurable amount of knowledge required, and cognitive strain from long hours in a stressful environment. Human doctors are therefore resource limited, and in treating patients they rely on functions that a robot could perform:
As doctors we see a great number of similar presentations of the same condition. We build an internal database of the symptoms and signs that indicate the correct diagnosis. When we see a patient presenting with symptoms X, Y, and Z, we can match them to the hundreds of previous patients with X, Y, and Z, and diagnose them successfully. In humans this technique is limited by our personal experience, and we can be biased by particularly memorable cases or by patients we have misdiagnosed who have then suffered because of it. Artificial intelligence uses the same technique; with the input of a multitude of patient presentations, however, a machine could apply this rule with much more precision and success than a human doctor, improving healthcare provision.
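As a minimal illustration only (the symptom labels, condition names, and case data are placeholders, not clinical logic), the case matching described above could be sketched as:

```python
# Hypothetical sketch: diagnose by matching a new presentation against a
# store of previously seen cases. All data here is illustrative.

def match_cases(presentation, case_database):
    """Return the diagnoses of past cases whose symptom set matches exactly."""
    wanted = frozenset(presentation)
    return [diagnosis for symptoms, diagnosis in case_database
            if frozenset(symptoms) == wanted]

# A toy "internal database" of past cases: (symptoms, diagnosis) pairs.
past_cases = [
    ({"X", "Y", "Z"}, "condition A"),
    ({"X", "Y"},      "condition B"),
    ({"X", "Y", "Z"}, "condition A"),
]

print(match_cases({"X", "Y", "Z"}, past_cases))  # ['condition A', 'condition A']
```

The human limitation the letter notes corresponds here to the size and bias of `past_cases`; a machine's database could be far larger and drawn from many doctors' experience.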
Over the course of our experience we meet the same presenting symptoms again and again. One technique humans use to approach a specific symptom is frequency gambling. We learn that when a certain set of symptoms X, Y, and Z occurs it most often leads to diagnosis A and only rarely to diagnosis B, so we are more likely to treat the patient immediately for diagnosis A. Artificial intelligence uses this system too; again, with the input of wide-reaching data, a machine could frequency gamble more accurately while potentially finding sets of symptoms that were ‘hidden’ to the human doctor.
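Frequency gambling can likewise be sketched as picking whichever diagnosis most often followed a given symptom set in past cases (again with purely illustrative data):

```python
from collections import Counter

# Hypothetical sketch of "frequency gambling": given a presentation, bet on
# the diagnosis that most frequently followed that symptom set before.

def frequency_gamble(presentation, case_database):
    """Return the most common past diagnosis for this symptom set, or None."""
    wanted = frozenset(presentation)
    counts = Counter(diagnosis for symptoms, diagnosis in case_database
                     if frozenset(symptoms) == wanted)
    if not counts:
        return None
    return counts.most_common(1)[0][0]

past_cases = [
    ({"X", "Y", "Z"}, "diagnosis A"),
    ({"X", "Y", "Z"}, "diagnosis A"),
    ({"X", "Y", "Z"}, "diagnosis B"),  # the rarer outcome
]

print(frequency_gamble({"X", "Y", "Z"}, past_cases))  # diagnosis A
```

The gamble is only as good as the frequencies it has seen, which is exactly why wider data would let a machine gamble more accurately than any one doctor's memory.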
Medical training requires repetition, which results in automated behaviour. Undertaking a task again and again allows the human doctor to perform it efficiently, effectively, and safely without having to engage cognitively. This is essential given the strain of being a physician; however, it can result in attentional slips and mistakes, causing harm to the patient and to the doctor. Artificial intelligence would function through an automated pattern with programmed ‘checks’ that could abort a task before harm to the patient occurs, a further advancement that could improve patient care.
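A programmed check of this kind might, in the simplest terms, look like the following sketch (the drug-free dose limit is an invented placeholder, not a clinical value):

```python
# Hypothetical sketch of an automated task guarded by a programmed safety
# check that aborts before harm, rather than relying on human attention.

MAX_DOSE_MG = 1000  # illustrative ceiling only, not a real dosing limit

def administer(dose_mg):
    """Proceed only if the pre-set check passes; otherwise abort and report."""
    if dose_mg > MAX_DOSE_MG:
        return "aborted: dose exceeds programmed limit"
    return f"administered {dose_mg} mg"

print(administer(500))   # within the limit, task proceeds
print(administer(5000))  # the check fires and the task is aborted
```

Unlike the tired human performing an automated behaviour, the check runs every single time, which is the advantage the letter points to.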
Applying artificial intelligence to empathy
Humans apply the above techniques to empathy. Mittelman, Markham, and Taylor wrote compassionately about their own needs through their experiences of multiple sclerosis, end stage renal disease, mental illness, epilepsy, and Crohn’s disease. I have never personally experienced any of these conditions, yet I must still show empathy for patients who have these diagnoses. Doctors are often taught frameworks for doing this, such as SPIKES (Setting, Perception, Invitation, Knowledge, Emotions, Strategy and Summary), and I must apply my training in, and experience of, breaking bad news and difficult diagnoses to care compassionately. To do this I must draw on similar experiences and reflections on patient care to communicate effectively with patients and discuss subjects that are life changing. Artificial intelligence may be able to comfort patients based on vast inputted experience, with a much more informed, experienced, and kind approach than I am able to offer.
Competing interests: I have read and understood BMJ policy on declaration of interests and have no relevant interests to declare.
1. Goldhahn J, Rampton V, Spinas GA. Could artificial intelligence make doctors obsolete? BMJ 2018;363:k4563.
2. Reason J. Human Error. Cambridge University Press; 1990.
3. Mittelman M, Markham S, Taylor M. Patient commentary: Stop hyping artificial intelligence—patients will always need human doctors. BMJ 2018;363:k4669.
4. Baile WF, Buckman R, Lenzi R, Glober G, Beale EA, Kudelka AP. SPIKES—a six-step protocol for delivering bad news: application to the patient with cancer. The Oncologist 2000;5(4):302-11.