December 4, 2024

AI tool can help improve quality of 113 counselling

PhD candidate Salim Salmi, based at the Amsterdam Science Park, has developed an AI tool to assist 113 Suicide Prevention counsellors during chat conversations with people contemplating suicide. He is currently investigating how this digital assistant could be implemented within the organisation. Salmi conducted his research at the Centrum Wiskunde & Informatica and will receive his doctorate from Vrije Universiteit Amsterdam on December 4.

The 113 helpline allows people with suicidal thoughts to express their feelings and concerns anonymously, with the ultimate goal of preventing suicide. In 2023, almost 185,000 contacts were made by phone and chat, 21.5% more than the year before. 113 is constantly looking for new ways to improve its service.

For his PhD, Salim Salmi worked on a tool that uses artificial intelligence to analyse online chats and then advises the counsellor on how to continue the conversation. But such a tool needs to ‘know’ which conversations are effective and which are not. This was one of the key questions in Salmi’s research, and also a big challenge: “We have a lot of text data, but we can’t link it to outcomes. Because people contact us anonymously, you don’t know how they feel after the conversation. So we asked them to fill in a questionnaire before and after the conversation. This questionnaire looks at the presence of characteristics that might indicate suicidal behaviour. It gives you a score. Do we see a change in that score after the conversation?”
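
The article does not describe the scoring in detail, so the following Python sketch only illustrates the idea: label each anonymous conversation by the change between its pre- and post-conversation questionnaire scores. The field names, threshold and scoring direction are assumptions, not 113’s actual pipeline.

```python
# Illustrative sketch only: derive a coarse outcome label from pre/post
# questionnaire scores, assuming a lower score means less acute distress.
from dataclasses import dataclass

@dataclass
class Conversation:
    chat_id: str
    pre_score: float    # questionnaire score before the conversation
    post_score: float   # questionnaire score after the conversation
    transcript: list[str]

def label_outcome(conv: Conversation, min_change: float = 1.0) -> str:
    """Label the conversation by the change in questionnaire score."""
    change = conv.pre_score - conv.post_score
    if change >= min_change:
        return "improved"
    if change <= -min_change:
        return "worsened"
    return "unchanged"

print(label_outcome(Conversation("chat-1", pre_score=7.0, post_score=4.5, transcript=[])))
```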

Because such data has been collected for years, a picture has emerged of which conversations lead to a ‘good’ outcome and which do not. Using this information, Salmi trained an AI model to find out which parts of a conversation improved or worsened the outcome. He designed the model so that he could look back and see which sentences the model considered decisive for the outcome of a conversation.
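
Salmi’s actual model is not specified in the article; as a minimal sketch of the general technique, the snippet below fits a linear classifier on sentence embeddings and then projects each sentence onto the learned weights to see which sentences pull the prediction towards a good or bad outcome. The hashing embed() helper and the tiny training set are placeholders, not real data.

```python
# Minimal sketch (not Salmi's model): score each sentence's contribution to a
# predicted conversation outcome with a linear model over sentence embeddings.
import numpy as np
from sklearn.linear_model import LogisticRegression

DIM = 64

def embed(sentence: str) -> np.ndarray:
    """Toy bag-of-words hashing embedding, standing in for a real text encoder."""
    vec = np.zeros(DIM)
    for word in sentence.lower().split():
        vec[hash(word) % DIM] += 1.0
    return vec

def conversation_vector(sentences: list[str]) -> np.ndarray:
    return np.mean([embed(s) for s in sentences], axis=0)

# Train on labelled conversations (1 = improved outcome, 0 = not improved).
conversations = [
    (["how are you feeling right now", "you are not alone in this"], 1),
    (["please hold", "I cannot help with that"], 0),
]
X = np.array([conversation_vector(s) for s, _ in conversations])
y = np.array([label for _, label in conversations])
model = LogisticRegression().fit(X, y)

def sentence_contributions(sentences: list[str]) -> list[tuple[str, float]]:
    """Project each sentence onto the learned weights; higher means more positive."""
    w = model.coef_[0]
    return sorted(((s, float(embed(s) @ w)) for s in sentences),
                  key=lambda pair: pair[1], reverse=True)

print(sentence_contributions(["you are not alone in this", "please hold"]))
```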

Digital assistant

The next step was to develop a digital assistant that can make suggestions to the counsellor during a chat conversation. The model searches its database of successful conversations for passages that apply to the current chat and presents them as suggestions. In this way, counsellors are shown examples from conversations that have actually taken place and decide for themselves whether to use them.
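
The article describes this as retrieval from a database of successful conversations. The sketch below shows one possible shape of such a lookup, ranking stored fragments by similarity to the current chat; the embed() helper, the example fragments and the top_k default are invented for illustration.

```python
# Illustrative retrieval sketch: suggest fragments of past successful
# conversations that are most similar to the current chat.
import numpy as np

DIM = 64

def embed(text: str) -> np.ndarray:
    """Toy hashing embedding, normalised so dot products act as cosine similarity."""
    vec = np.zeros(DIM)
    for word in text.lower().split():
        vec[hash(word) % DIM] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec

# Fragments taken from conversations with a good outcome (invented examples).
successful_fragments = [
    "It sounds like things feel unbearable right now.",
    "Would it help to talk about what happened today?",
    "You reached out, and that already took courage.",
]
db = np.stack([embed(t) for t in successful_fragments])

def suggest(current_chat: str, top_k: int = 2) -> list[str]:
    """Return the top_k most similar past fragments as non-binding suggestions."""
    sims = db @ embed(current_chat)
    best = np.argsort(sims)[::-1][:top_k]
    return [successful_fragments[i] for i in best]

print(suggest("I don't know how to keep going after today"))
```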

The results

The digital assistant has been extensively tested: first in a group of 24 counsellors, then in a randomised study in which 27 counsellors were given the tool and 21 were not. Salmi: “They decided for themselves when to consult the assistant and what to do with the suggestions.” The results: “Conversations with the AI tool were slightly shorter on average. In terms of self-efficacy – the belief in one’s own ability to successfully complete a task – the counsellors reported little difference. They chose to use the AI assistant mainly in difficult conversations where they did not know how to get through to the person.”

The next step is to extend the model from chat to phone calls. Now that his doctoral research is complete, Salmi is embarking on a new project: integrating his tool into the platform people use to ask 113 for help.
