The rapidly emerging use of artificial intelligence in mental health technologies offers exciting opportunities, but these are not without significant concerns and limitations. Vital to assessing these new technologies is an understanding of how user experiences will be shaped by individuals' intersecting social identities, including their age, gender, sexual orientation, ethnicity and socioeconomic circumstances.
Intersectionality helps us understand how different parts of our personal identity interact with each other, creating unique experiences of privilege or disadvantage within society. In the context of AI mental health tools, an intersectional approach allows us to assess both potential benefits and risks of these technologies for different groups.
This article explores how AI, specifically conversational artificial intelligence or 'AI chatbots', has the potential to either overcome or exacerbate disparities in mental health experiences and access across different social identities, highlighting examples of both benefits and risks for different groups.
Age
AI has the potential to provide personalised care based on age-related needs, making mental health technologies more applicable to different stages of life. For example, Charlie, a social and empathetic chatbot designed with a child persona, was created to address loneliness among older adults. Charlie offers interactive activities, gamifies daily tasks and encourages self-compassion in its users. Initial research feedback from elderly users describes Charlie as polite, charming and reliable, highlighting its potential to provide meaningful companionship.[1]
While digital technologies may help improve the mental health of older users, challenges remain in terms of digital literacy, adoption and access. Structural barriers such as ageism and poverty may limit the use of technology by older adults, with digital exclusion often mirroring wider social exclusions. Older adults may also internalise ageist stereotypes regarding technological capability, and blame themselves for their perceived lack of technological skills.[2] If AI therapeutic tools are designed and implemented without taking age-specific challenges into account, their usefulness for elderly populations may be limited. There are also concerns that vulnerable populations, including older adults, may become dependent on AI chatbots and withdraw from relationships with people.[3]
LGBTQ+ Community
AI chatbots have been trialled as mental wellness tools for LGBTQ+ individuals because of their immediacy and accessibility, which benefit those who lack support in real life. The systematic marginalisation and lack of representation of LGBTQ+ individuals often deter them from seeking professional therapy, especially given the risk of encountering non-affirming therapists. These chatbots offer LGBTQ+ users a safe environment for intimate conversations, and can help users develop social skills by enabling them to rehearse experiences such as coming out and dating in a supportive, nonjudgmental environment.[4]
There are concerns, however, that AI chatbots may perpetuate harmful stereotypes due to biases in their training data, particularly affecting underrepresented LGBTQ+ identities or LGBTQ+ users who also belong to another marginalised community. The generalised responses of chatbots often fail to address the nuanced emotional needs of LGBTQ+ individuals, which may reinforce damaging narratives and feelings of isolation. Additionally, chatbots may offer advice that overlooks evolving societal norms, potentially putting users at risk if it is followed in unsafe situations, such as coming out to unsupportive parents.[5]
Gender
AI chatbots offer a private and anonymous platform for accessing mental health support, making them a useful tool for challenging traditional gender norms. Men are often reluctant to access mental health services for fear of being judged or labelled, a significant barrier to adequate mental health support.[6] The use of AI chatbots could help reduce the stigma around men seeking therapy.
Research also suggests AI chatbots may be a useful tool in addressing specific challenges in women's health by offering easily accessible, personalised and confidential consultations. However, AI chatbots may lack the nuanced understanding and empathy that human healthcare practitioners can provide, limiting their applicability in sensitive areas of women's health where emotional support and understanding are crucial.[7] AI systems can also perpetuate biases and discrimination if they are not trained on diverse and inclusive datasets, which can result in unfair or harmful advice for women, especially those from marginalised communities.[8]
Ethnicity
AI has the potential to reduce disparities in mental health care among minoritised communities by providing culturally sensitive interactions. A large-scale study explored the impact of a personalised self-referral chatbot on referrals to NHS Talking Therapies in the UK, finding increased engagement and greater diversity among referrals compared with traditional web forms. The chatbot's empathetic, tailored questions helped patients explore their mental health in more detail, reducing stigma and encouraging treatment. Referrals increased by 39% among Asian and Asian British groups and 40% among Black and Black British individuals, with many identifying their mental health needs for the first time. Different minoritised groups also expressed greater need for specific treatments, underlining the value of addressing their particular needs and the pre-existing barriers to access. These findings demonstrate the chatbot's potential to enhance access and inclusivity in mental health services.[9]
However, AI systems that are not designed with cultural sensitivity risk perpetuating ethnic disparities in mental health care. ChatGPT, for example, demonstrates limited awareness of its cultural biases, which stem from its training on predominantly Western data.[10] While ChatGPT attempts to validate users and tailor guidance to the cultural contexts they provide, its advice often lacks the nuance and depth required to address diverse cultural experiences effectively, and can default to guidance rooted in Western norms. This highlights the importance of designing AI systems with cultural sensitivity in mind, to ensure they provide appropriate mental health support across different cultural contexts.
Socioeconomic Status
AI has the potential to provide low-cost or free access to talking therapies and mental health tools. For individuals in lower socioeconomic circumstances, AI could reduce financial barriers and offer an alternative to private therapy. AI could also be used to provide mental health support in under-resourced communities where mental health professionals are in short supply, making care more accessible. For example, chatbots could offer mental health support to individuals in remote or rural areas where traditional mental health services may be scarce.[11]
However, limited access to technology, also known as digital poverty, may prevent individuals from benefiting from AI mental health tools. Those without reliable internet access or up-to-date technology are at a significant disadvantage.[12] Furthermore, if training data do not account for experiences such as job insecurity, homelessness or exposure to violence, AI tools may not adequately address the specific mental health issues faced by lower-income communities. If these tools become more widely available yet remain inaccessible to lower-income communities, socioeconomic disparities in mental health care will deepen.
Conclusion
An intersectional approach is therefore essential to understanding how AI can both perpetuate and overcome disparities in mental health experiences, and this must be taken into account when developing and designing AI mental health technologies.
Caitlin Daly and Ellen Ji, January 2025
References
[1] Valtolina, S. and Hu, L. (2021). Charlie: a Chatbot to Improve the Elderly Quality of Life and to Make Them More Active to Fight Their Sense of Loneliness. In: 14th Biannual Conference of the Italian SIGCHI Chapter. pp.1–5. Retrieved from: https://doi.org/10.1145/3464385.3464726.
[2] Fang, M. L. et al. (2021). Technology Access Is a Human Right! Illuminating Intersectional, Digital Determinants of Health to Enable Agency in a Digitized Era. In: TMS Proceedings 2021. Retrieved from: https://doi.org/10.1037/tms0000123.
[3] Pani, B., Crawford, J. and Allen, K.A. (2024). Can Generative Artificial Intelligence Foster Belongingness, Social Support, and Reduce Loneliness? A Conceptual Analysis. In: Lyu, Z. (ed.) Applications of Generative AI. Springer, Cham. Retrieved from: https://doi.org/10.1007/978-3-031-46238-2_13.
[4] Ma, Z. et al. (2024). Evaluating the Experience of LGBTQ+ People Using Large Language Model Based Chatbots for Mental Health Support. In: Proceedings of the CHI Conference on Human Factors in Computing Systems. Retrieved from: https://doi.org/10.1145/3613904.3642482.
[5] Ma, Z. et al. (2024). Evaluating the Experience of LGBTQ+ People Using Large Language Model Based Chatbots for Mental Health Support. In: Proceedings of the CHI Conference on Human Factors in Computing Systems. Retrieved from: https://doi.org/10.1145/3613904.3642482.
[6] Van der Schyff, E. et al. (2023). Providing Self-Led Mental Health Support through an Artificial Intelligence–Powered Chat Bot (Leora) to Meet the Demand of Mental Health Care. Journal of Medical Internet Research, 25, pp.1–8. Retrieved from: https://doi.org/10.2196/46448.
[7] Kim, H.K. (2024). The Effects of Artificial Intelligence Chatbots on Women’s Health: A Systematic Review and Meta-Analysis. Healthcare, 12(5), 534. Retrieved from: https://doi.org/10.3390/healthcare12050534.
[8] Negi, R. (2024). Improving Women’s Mental Health through AI-powered Interventions and Diagnoses. In: M. Gupta and J. Hemanth, eds., Artificial Intelligence and Machine Learning for Women’s Health Issues. [online] Amsterdam: Elsevier, pp.173–191. Retrieved from: https://doi.org/10.1016/b978-0-443-21889-7.00017-8.
[9] Habicht, J. et al. (2024). Closing the Accessibility Gap to Mental Health Treatment with a Personalized self-referral Chatbot. Nature Medicine, 30, pp.595–602. Retrieved from: https://doi.org/10.1038/s41591-023-02766-x.
[10] Aleem, M., Zahoor, I. and Naseem, M. (2024). Towards Culturally Adaptive Large Language Models in Mental Health: Using ChatGPT as a Case Study. In: Proceedings of the 27th ACM SIGCHI Conference on Computer-Supported Cooperative Work & Social Computing. pp.240–247. Retrieved from: https://doi.org/10.1145/3678884.3681858.
[11] Potts, C. et al. (2021). Chatbots to Support Mental Wellbeing of People Living in Rural Areas: Can User Groups Contribute to Co-design? Journal of Technology in Behavioral Science, 6(4), pp.652–665. Retrieved from: https://doi.org/10.1007/s41347-021-00222-6.
[12] Saeed, S.A. and Masters, R.M. (2021). Disparities in Health Care and the Digital Divide. Current Psychiatry Reports, 23(9). Retrieved from: https://doi.org/10.1007/s11920-021-01274-4.