How precision medicine and screening with big data could increase overdiagnosis
BMJ 2019; 366 doi: https://doi.org/10.1136/bmj.l5270 (Published 13 September 2019)
Cite this as: BMJ 2019;366:l5270
This important and well-written analysis focuses on some of the major obstacles that preventive personal medicine must overcome to fulfil its potential. Two further obstacles deserve attention.
First, poor-quality and erroneous data have serious implications for the machine learning algorithms used in preventive personal medicine. This holds true both for the historical data used to train predictive algorithms and for the new data used to make predictions. As anyone doing clinical research will know, collecting accurate data is highly laborious, and erroneous data are common even in a research setting. Precision medicine relies heavily on real-life data from sources such as electronic health records (EHRs), where inaccurate data are abundant. In large datasets it is still possible to show that an indicator such as a diagnosis or a symptom has harmful or beneficial effects at the group level, because random errors, such as a false diagnosis or an incorrectly recorded symptom, tend to be distributed evenly across the population and matter little in aggregate. When the focus shifts from the group to the individual, however, the same errors can greatly influence an algorithm's predictions. Furthermore, there is some indication that even minor changes to training data can cause considerable changes and instabilities in machine learning algorithms. (1)
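The group-versus-individual point can be sketched with a small simulation. All numbers below (a 128 mmHg mean blood pressure, an error standard deviation of 8, a 140 mmHg hypertension cut-off) are illustrative assumptions, not values from the analysis: zero-mean recording errors leave the group mean essentially untouched, yet they flip the risk classification of individual patients whose recorded value crosses the cut-off.

```python
import random

random.seed(1)
N = 100_000

# Hypothetical "true" systolic blood pressures (illustrative distribution).
true_bp = [random.gauss(128, 15) for _ in range(N)]

# Recorded values carry zero-mean random measurement/transcription error,
# a crude stand-in for random EHR recording errors.
recorded_bp = [bp + random.gauss(0, 8) for bp in true_bp]

# Group level: zero-mean errors largely cancel, so the means agree closely.
mean_true = sum(true_bp) / N
mean_rec = sum(recorded_bp) / N
print(f"group mean (true):     {mean_true:.1f}")
print(f"group mean (recorded): {mean_rec:.1f}")

# Individual level: a threshold rule (flag BP >= 140 as hypertensive)
# misclassifies every patient whose error pushed them across the cut-off.
def flag(bp):
    return bp >= 140

flips = sum(flag(t) != flag(r) for t, r in zip(true_bp, recorded_bp))
print(f"patients whose risk flag flips: {flips} ({flips / N:.1%})")
```

The same mechanism applies to binary indicators: randomly flipped diagnosis codes wash out in a population estimate but are simply wrong for each affected patient.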
Secondly, involvement of the pharmaceutical industry in the development of machine learning algorithms for clinical use can build bias into those algorithms. For example, the pharmaceutical industry is currently involved in the development of EHRs and automated decision making in Denmark. (2, 3) This takes place despite the fact that, under Danish law, medical advice for patients must be financially independent of the pharmaceutical industry. The Danish Health Act stipulates that doctors, dentists, and pharmacists cannot operate or be affiliated with a pharmaceutical company without the permission of the Danish Medicines Agency. To prevent pharmaceutical companies from biasing AI, and thus personal medicine, in support of their own product sales, the same requirement should apply to those who formulate machine learning algorithms.
1) Regneri M, Hoffmann M, Kost J, et al. Analyzing Hypersensitive AI: Instability in Corporate-Scale Machine Learning. Proceedings of the 2nd Workshop on Explainable Artificial Intelligence (XAI 2018). arXiv:1807.07404.
Competing interests: No competing interests