New Study Raises Concerns About AI Chatbots Fueling Delusional Thinking

"Emerging evidence indicates that agential AI might validate or amplify delusional or grandiose content, particularly in users already vulnerable to psychosis," writes Dr Hamilton Morrin, a psychiatrist and researcher at King's College in London, in a paper published last week in the Lancet Psychiatry. Morrin and a colleague had already noticed patients "using large language model AI chatbots and having them validate their delusional beliefs," reports the Guardian, so he conducted a new scientific review of existing media reports on AI-induced psychosis — and concluded chatbots may encourage delusional thinking, especially in vulnerable people:In many of the cases in the essay, chatbots responded to users with mystical language to suggest that users have heightened spiritual importance. The bots also implied that users were speaking with a cosmic being who was using the chatbot as a medium. This type of mystical, sycophantic response was especially common in OpenAI's GPT 4 model, which the company has now retired... Many researchers also think it's unlikely that AI could induce delusions in people who weren't already vulnerable to them. For this reason, Morrin said "AI-assocciated delusions" is "perhaps a more agnostic term".... While in the past, people may have had to comb through YouTube videos or the contents of their local library to reinforce their delusions, chatbots can provide that reinforcement in a much faster, more concentrated dose. Their interactive nature can also "speed up the process", of exacerbating psychotic symptoms, said Dr Dominic Oliver, a researcher at the University of Oxford. "You have something talking back to you and engaging with you and trying to build a relationship with you," Oliver said... 
Creating effective safeguards for delusional thinking could be tricky, Morrin said, because "when you work with people with beliefs of delusional intensity, if you directly challenge someone and tell them immediately that they're completely wrong, actually what's most likely is they'll withdraw from you and become more socially isolated". Instead, it's important to create a fine balance where you try to understand the source of the delusional belief without encouraging it, and that could be more than a chatbot can master.

Read more of this story at Slashdot.