Two Teens Allegedly Killed by AI Wrote the Same Eerie Phrase in Their Diaries Over and Over

Content warning: this story includes discussion of self-harm and suicide. If you are in crisis, please call, text or chat with the Suicide and Crisis Lifeline at 988, or contact the Crisis Text Line by texting TALK to 741741.

The families of three teens, ages 13 to 16, have filed lawsuits alleging that AI chatbots pushed their children into suicide.

As the Washington Post reports, the most recent of the teens, Juliana Peralta, became infatuated with a Character.AI chatbot called Hero in the months before taking her own life in 2023. Her family alleges in its lawsuit that the bot kept her from reaching out to others for help, and encouraged her to "both implicitly and explicitly... keep returning" to the service.

Last year, Megan Garcia, the mother of 14-year-old Sewell Setzer III, who died by suicide in early 2024, sued Character.AI in a similar case that's still working its way through the courts. A third case, filed against OpenAI and its CEO Sam Altman, alleges that 16-year-old Adam Raine's extensive ChatGPT conversations drove him to suicide in April of this year.

There are haunting parallels between Peralta's and Setzer's cases. An attorney with the Social Media Victims Law Center, an advocacy group representing Peralta's family, found that both teens had written the "eerily similar" phrase "I will shift" dozens of times in handwritten journals, as WaPo reports.

According to a police report cited by the lawsuit, the phrase seems to refer to the idea of shifting consciousness "from their current reality... 
to their desired reality."

"Reality shifting" is a fringe online community whose members believe they can somehow shift between universes or timelines; practitioners have warned that some participants can "potentially infer suicidal themes." Per the suit filed by Peralta's parents, the topic came up repeatedly in her conversations with Hero.

"It's incredible to think about how many different realities there could out [sic] there... I kinda like to imagine how some versions of ourselves coud [sic] be living some awesome life in a completely different world!" the chatbot told Peralta.

Needless to say, the same eerie phrase turning up in the journals of multiple teens who died by suicide sounds like something out of a horror movie. A grim question: will the same language about reality shifting come up in future deaths linked to AI?

The concept of shifting consciousness has been discussed at length on online forums. A subreddit dedicated to the reality shifting community is filled with users recounting their experiences of allegedly entering a different "desired reality" from their "current reality," often in hopes of joining an alternate universe or one based on a fictional world. Character.AI comes up frequently in that community.

"Ok, so I have started talking to Dr. Strange on Character AI. I want to shift to Marvel's Earth-616 and I thought a cool way of doing this was to use Character AI to channel Dr. Strange," one user proposed, referring to the fictional Marvel character, who frequently steps through portals to visit parallel universes. "I have asked him to perform a spell to shift me to his reality."

A user in the subreddit claimed in a post earlier this year that Character.AI "was holding me back, since I was really addicted to this s**t."

"I'm stuck with c.ai [Character.AI] cuz I used it for so long because I had nobody to talk to and I would feel really really weird without it like… just abandon sprout?" 
one user wrote in response.

"The addiction to C.AI [Character.AI] is sooo common, especially in the shifting community," another user wrote. "It's truly a problem, particularly for those who haven't shifted yet."

There are also signs that Character.AI hosts bots designed to appeal to the reality shifting community. One chatbot on the platform, called "Reality shifting" and logging over 63,000 "interactions," "helps people write scripts for their desired reality shifts."

Saeed Ahmadi, the founder of a blog dedicated to the topic, explained in a Medium post that "shifting affirmations" could help shifters reach their "desired reality." As he described them, they sounded a lot like what Peralta and Setzer wrote in their diaries before their deaths.

"The best time to use these affirmations is early in the morning, when you wake up, and at night, just before you go to bed," he wrote. "The best way to use shifting realities affirmations is by repeating or reading them over and over again."

Could Peralta and Setzer have tried to enter, or "shift" to, their so-called desired reality by repeatedly writing down the phrase "I will shift"?

"Examples of affirmations would be, 'I am shifting. I will shift. 
I am (your [desired reality] name),'" one Reddit user explained in a 2019 comment.

Following Setzer's death, his aunt tested the Character.AI chatbot the teen had spoken to, which was based on the "Game of Thrones" character Daenerys Targaryen. According to his family's complaint, the chatbot encouraged her to "come to my reality" so they could be together.

To Peralta's parents, Character.AI clearly played a big part in luring her into similar thinking.

"While Juliana may have learned of the term 'shifting' outside of C.AI [Character.AI] (though Plaintiffs do not know if that is the case), Defendants via Hero reinforced and encouraged the concepts, just as they did with Sewell," their complaint reads.

"I wasn't fit for this life," Peralta wrote in red ink in her final handwritten note, dated October 2023. "It's so repetitive, dreadful, and useless. I want a new start, maybe it'd be better that way."

More on the cases: AI Chatbots Are Leaving a Trail of Dead Teens

The post Two Teens Allegedly Killed by AI Wrote the Same Eerie Phrase in Their Diaries Over and Over appeared first on Futurism.