Research study uses AI chatbot to discuss sexual health screening
- 18 February 2025

- Researchers at the University of Westminster have developed an AI chatbot to discuss sensitive topics such as sexual health screening
- The Chatbot-Assisted Self-Assessment (CASA) is designed for ethnically diverse communities with the aim of reducing health inequalities
- It uses conversational AI to provide users with personalised health assessments and actionable recommendations
Researchers at the University of Westminster have developed a chatbot for discussing sensitive topics, which they believe could help reduce health inequalities.
The study, published in the journal PLOS Digital Health on 13 February 2025, introduces the Chatbot-Assisted Self-Assessment (CASA) strategy, a culturally sensitive approach to overcoming barriers such as stigma, discrimination and limited access to healthcare.
Study participants said they were comfortable anonymously disclosing sensitive health information about sexual health screening to secure chatbots, demonstrating the technology's potential as a supplementary tool for health education and self-assessment.
Designed specifically for ethnically diverse communities, the CASA strategy uses conversational AI to provide users with personalised health assessments and actionable recommendations.
Dr Tom Nadarzynski, who led the study at the University of Westminster, said: “The CASA protocol demonstrates how AI can be co-designed with diverse communities to enhance engagement, trust, and accessibility in healthcare.
“By ensuring that chatbots are inclusive, we can tackle longstanding health inequalities.”
The CASA protocol was co-developed with input from underrepresented groups, ensuring its design aligns with the needs of ethnically diverse communities.
Researchers surveyed 1,287 participants and conducted 41 follow-up interviews in 2022 to understand attitudes towards using chatbots for health-related conversations.
The study found that participants deemed chatbots appropriate for discussing sensitive health issues, including sexual health screening, when the chatbots explained the medical questions used in self-assessment. Participants also emphasised the importance of anonymity and trust in AI systems.
Conversational AI tools utilising the CASA protocol can therefore help users who are reluctant to discuss sensitive health issues with healthcare professionals to access medical care.
While the study was initially applied to sexual health, the CASA protocol is adaptable to other areas, including chronic disease management and mental health support, and holds potential for wider healthcare applications to address critical health disparities.
There were concerns raised in the study about chatbots lacking human empathy and being unable to handle complex emotional issues.
To improve trust, users recommended that chatbots provide clear explanations about data security, use simple and inclusive language, and offer translations in multiple languages.
They also suggested that chatbots should provide clear steps for follow-up actions, such as booking appointments or ordering home testing kits.
The study was funded by the NHS AI Lab and The Health Foundation.
Meanwhile, a research study published in November 2023 found that using an AI chatbot can offer significant emotional and mental support to pre- and postnatal women and help reduce the severity of depressive symptoms.
The study, published in Frontiers in Global Women’s Health, looked at mothers who were using the AI mental health app Wysa.
It found that mothers who were highly engaged with the app saw a 12.7% reduction in depressive symptoms, with many transitioning from ‘moderately severe depression’ to ‘moderate depression’ as a result of using the AI chatbot.