Careers

Is IT the end for doctors?

BMJ 2016; 354 doi: https://doi.org/10.1136/bmj.i3306 (Published 11 July 2016) Cite this as: BMJ 2016;354:i3306
Thomas Foley, academic clinical fellow in psychiatry
Institute of Health and Society, University of Newcastle, Tyne and Wear NHS Foundation Trust
thomas.foley{at}newcastle.ac.uk

Abstract

The increasing digitisation of medicine presents doctors with new opportunities, says Thomas Foley

There is increasing interest in the potential for big data and artificial intelligence to transform society. Medicine has not escaped attention, with the media,1 politicians,2 and investors3 claiming that technology will replace much of what doctors currently do.

What will change?

As part of a project looking at the potential of existing or near future technology in healthcare, we reviewed the literature and engaged almost 60 experts in seminars and interviews.4

We found evidence of technology being used within healthcare in potentially transformative ways:

  • Intelligent automation of routine aspects of care5 can free clinicians from administrative burdens.

  • Clinical decision support systems can help clinicians to deal with unfamiliar or high risk situations.6

  • Predictive models can identify instances where there is a high risk of low quality or unnecessarily expensive care,7 and where preventative action might be successful.8

  • Outcome data can be used to identify positive deviants (exceptional providers or clinicians), who can then be studied to generate hypotheses about their performance that can be tested and disseminated.9

  • Real time surveillance systems can track epidemiological phenomena and adverse events related to treatments much more quickly than passive reporting systems.10

  • Comparative effectiveness research (CER) using routinely collected data can fill gaps in the evidence base more quickly and at lower cost than traditionally designed randomised controlled trials.11

What will this mean for doctors?

While the digitisation of healthcare might feel like yet another learning burden, it may be the only way for clinicians to cope with an ever growing flow of evidence. Arguably, it is a natural development of the evidence based medicine movement.

Medical training increasingly emphasises asking the right questions and interpreting data rather than memorising endless facts. At the same time, sifting large volumes of patient generated data (from smartphones, social media, and other sources) is likely to become an important skill. Technological developments should be reflected in curricula: students and trainees should learn what technology is capable of, so that they can challenge the provision of technology in the settings where they work.

There will also be new opportunities. More postgraduate qualifications in medical informatics are becoming available and many healthcare providers have appointed chief clinical information officers to bridge the gap between clinicians and IT teams. In the US these positions have been professionalised and attract dedicated time. In the UK moves are underway to establish a faculty of medical informatics.12

Few new technologies can simply be plugged into the existing health system. Benefits usually depend on re-engineering processes so that technology provides better ways of working. This practice is well established in other industries and some providers are recruiting experts from those fields. It will be important for doctors to have the quality improvement and leadership skills to participate in these redesigns.

Technology may also diminish the separation between clinical practice and research. Training curricula already emphasise research methods, and many doctors follow a research career; in future, the day-to-day work of all doctors may involve what we currently think of as research. Individual clinicians could have access to large datasets that are relevant to their patients. They will need to know how to ask the right questions of these data and how to interpret ambiguous answers to provide personalised care.11

Training, assessment, and revalidation of doctors could change radically as it becomes possible to monitor the performance of clinicians automatically. Training portfolios may become a thing of the past, but doctors will need new skills to reflect on, and act upon, immediate feedback.

Ultimately doctors will need to adapt, as they always have, but we found little support for the view that they would be rendered obsolete any time soon. As one participant pointed out, “It is important not to forget the unique qualities a doctor brings to the relationship with the patient that cannot be offloaded to the IT side . . . Even as medicine changes, the doctor will still bring empathy, intuition, autonomy, and accountability to the relationship with patients.”13

Footnotes

  • See the Learning Healthcare Project website www.learninghealthcareproject.org for our final report, along with all of our interviews, seminars, and other source materials.

  • TF is special adviser to the Wachter Review of Healthcare IT in the NHS (unpaid appointment). He also received research funding from the Health Foundation for the Learning Healthcare Project.

References