
Letters: In praise of boring AI

Young people and other vulnerable groups are overlooked in AI algorithm development

BMJ 2024; 386 doi: https://doi.org/10.1136/bmj.q1883 (Published 03 September 2024) Cite this as: BMJ 2024;386:q1883
  1. Ediriweera Desapriya, research associate1,
  2. Siyath Ariyaratne, undergraduate student2,
  3. Crystal Ma, MD undergraduate student3,
  4. Hasara Illuppella, student researcher1,
  5. Peter Tiu, research assistant1,
  6. Dinidu Akalanka Wijesinghe, research assistant1,
  7. Shaluka Manchanayake, critical care physician4,
  8. Jay Herath, professor5
  1. Department of Pediatrics, Faculty of Medicine, University of British Columbia, Vancouver, BC, Canada
  2. Faculty of Science, Simon Fraser University, Burnaby, BC, Canada
  3. Faculty of Medicine, University of British Columbia, Vancouver, BC, Canada
  4. National Hospital (Teaching) Kandy, William Gopallawa Mawatha, Kandy, Central Province, Sri Lanka
  5. Department of Chemistry and Biochemistry, Loyola University, New Orleans, LA, USA

Correspondence to: edesap{at}mail.ubc.ca

Abbasi’s article on biases in AI clinical tools highlights that their development often overlooks the perspectives of younger people—an essential segment of our society.1 This oversight extends beyond healthcare; similar problems arise in fields such as transportation safety.

AI-powered autonomous vehicles, for example, have been shown to recognise child pedestrians poorly. This problem stems from AI models being trained on biased …
