
Rapid response to:

Analysis

Using artificial intelligence to assess clinicians’ communication skills

BMJ 2019;364:l161 doi: https://doi.org/10.1136/bmj.l161 (Published 18 January 2019)

Rapid Response:

Artificial Intelligence Can Reproduce Gender and Racial Bias

Ryan et al. are right to highlight new methods of training and assessing doctors (1). As a profession we must always be prepared to embrace technology that promises to improve our practice. However, caution is also required. Artificial intelligence is not synonymous with objectivity. It has been shown consistently that artificial intelligence, even when believed to be objectively programmed, imitates and reproduces gender and racial biases (2). There is no guarantee that AI will be more objective than a human assessor, and such an assumption cannot be made automatically.
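A minimal sketch of the kind of word-embedding association test reported in reference 2 illustrates how such bias can surface. The vectors below are hypothetical toy values chosen for illustration only; real analyses use embeddings trained on large text corpora (for example word2vec or GloVe).

```python
# Illustrative sketch of a word-embedding association test in the spirit of
# Caliskan et al. (2017). The 3-dimensional vectors are hypothetical toy
# values; trained embeddings from real corpora would be used in practice.
import numpy as np

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical embeddings for illustration only.
vectors = {
    "doctor": np.array([0.9, 0.2, 0.1]),
    "nurse":  np.array([0.2, 0.9, 0.1]),
    "he":     np.array([0.8, 0.1, 0.2]),
    "she":    np.array([0.1, 0.8, 0.2]),
}

def association(word, male_terms, female_terms):
    """Mean similarity to male terms minus mean similarity to female terms.
    A value far from zero indicates the embedding associates the word
    more strongly with one gender."""
    male = np.mean([cosine(vectors[word], vectors[t]) for t in male_terms])
    female = np.mean([cosine(vectors[word], vectors[t]) for t in female_terms])
    return male - female

for occupation in ("doctor", "nurse"):
    print(occupation, round(association(occupation, ["he"], ["she"]), 3))
```

With these toy values the score is positive for "doctor" and negative for "nurse", mirroring the gender-occupation associations that reference 2 found in embeddings derived from ordinary human-written text.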

(1) Ryan P, Luz S, Albert P, Vogel C, Normand C, Elwyn G. Using artificial intelligence to assess clinicians' communication skills. BMJ 2019;364:l161.
(2) Caliskan A, Bryson JJ, Narayanan A. Semantics derived automatically from language corpora contain human-like biases. Science 2017;356(6334):183-6.

Competing interests: No competing interests

31 January 2019
Chika E Uzoigwe
Doctor
Sanchez Franco LC, Shabani F
Harcourt House, Sheffield