Letters
In praise of boring AI
Young people and other vulnerable groups are overlooked in AI algorithm development
BMJ 2024;386 doi: https://doi.org/10.1136/bmj.q1883 (Published 03 September 2024) Cite this as: BMJ 2024;386:q1883
- Ediriweera Desapriya, research associate1,
- Siyath Ariyaratne, undergraduate student2,
- Crystal Ma, MD undergraduate student3,
- Hasara Illuppella, student researcher1,
- Peter Tiu, research assistant1,
- Dinidu Akalanka Wijesinghe, research assistant1,
- Shaluka Manchanayake, critical care physician4,
- Jay Herath, professor5
- 1Department of Pediatrics, Faculty of Medicine, University of British Columbia, Vancouver, BC, Canada
- 2Faculty of Science, Simon Fraser University, Burnaby, BC, Canada
- 3Faculty of Medicine, University of British Columbia, Vancouver, BC, Canada
- 4National Hospital (Teaching) Kandy, William Gopallawa Mawatha, Kandy, Central Province, Sri Lanka
- 5Department of Chemistry and Biochemistry, Loyola University, New Orleans, LA, USA
- edesap@mail.ubc.ca
Abbasi’s article on biases in AI clinical tools highlights that they often overlook the perspectives of younger people—an essential segment of our society.1 This oversight extends beyond healthcare; similar problems arise in fields like transportation safety.
AI-powered autonomous vehicles, for example, have been shown to recognise child pedestrians poorly. This problem stems from AI models being trained on biased …