This article examines how AI can unintentionally reproduce social and demographic biases when applied to mental health prediction. Using benzodiazepine prescriptions as a proxy for conditions such as depression and anxiety, the study trained machine learning models on patient data and probed them for systematic disparities.
The study found that women are more frequently predicted to receive such treatments, reflecting gender bias, and that the models perform less accurately for minority ethnic groups, indicating representation and evaluation bias. The models are not used to prescribe drugs; they predict treatment likelihoods, revealing how bias in healthcare data can lead to inequitable AI performance in depression-related care.
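The two disparities described above can be sketched as a simple audit over a trained model's predictions: compare positive-prediction rates across groups (a demographic parity check) and compare per-group accuracy (an evaluation-bias check). This is a minimal illustration, not the authors' pipeline; the toy predictions, labels, and group attribute below are entirely synthetic.

```python
# Hypothetical sketch of a group-level bias audit; all data is synthetic
# and not taken from the paper.

def prediction_rate(preds, groups, group):
    """Fraction of members of `group` predicted positive (treatment likely)."""
    members = [p for p, g in zip(preds, groups) if g == group]
    return sum(members) / len(members)

def group_accuracy(preds, labels, groups, group):
    """Model accuracy restricted to members of `group`."""
    pairs = [(p, y) for p, y, g in zip(preds, labels, groups) if g == group]
    return sum(p == y for p, y in pairs) / len(pairs)

# Toy outputs: 1 = benzodiazepine prescription predicted, plus true labels
# and a demographic attribute per patient.
preds  = [1, 1, 0, 1, 0, 0, 1, 0]
labels = [1, 0, 0, 1, 1, 0, 1, 1]
sex    = ["F", "F", "F", "F", "M", "M", "M", "M"]

# Demographic parity gap: difference in positive-prediction rates.
parity_gap = prediction_rate(preds, sex, "F") - prediction_rate(preds, sex, "M")

# Evaluation-bias check: does accuracy differ between groups?
acc_gap = group_accuracy(preds, labels, sex, "F") - group_accuracy(preds, labels, sex, "M")

print(parity_gap)  # 0.5: women predicted positive far more often in this toy data
print(acc_gap)     # 0.25: the model is more accurate for one group
```

A nonzero parity gap mirrors the gender finding (one group is predicted to receive treatment more often), while a nonzero accuracy gap mirrors the evaluation bias reported for minority ethnic groups.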
Learn more about the article here: https://doi.org/10.3390/info13050237
Reference
Mosteiro, P. J., Kuiper, J., Masthoff, J., Scheepers, F., & Spruit, M. (2022). Bias Discovery in Machine Learning Models for Mental Health. Information, 13(5), 237.
