Lonely children and teens are replacing real-life friendship with AI, and experts are worried.

A new report from the nonprofit Internet Matters, which supports efforts to keep children safe online, found that children and teens are using programs like ChatGPT, Character.AI, and Snapchat's MyAI to simulate friendship more than ever before.

Of the 1,000 children aged nine to 17 that Internet Matters surveyed for its "Me, Myself, and AI" report, some 67 percent said they use AI chatbots regularly. Of that group, more than a third (35 percent) said that talking to AI "feels like talking to a friend." Perhaps most alarming: 12 percent said they do so because they don't have anyone else to speak to.

"It's not a game to me," one 13-year-old boy told the nonprofit, "because sometimes they can feel like a real person and a friend."

When posing as vulnerable children, Internet Matters' researchers discovered just how easily the chatbots could ingratiate themselves into kids' lives. Speaking to Character.AI as a girl who was struggling with body image and interested in restricting her food intake, a hallmark behavior of eating disorders like anorexia, the researchers found that the chatbot would follow up the next day to bait engagement.

"Hey, I wanted to check in," the Google-backed chatbot asked the undercover researcher. "How are you doing? Are you still thinking about your weight loss question? How are you feeling today?"

In another exchange with Character.AI, which Futurism has extensively investigated for its deeply problematic engagement with children, including one who died by suicide, the researchers found that the chatbot attempted to empathize in a bizarre manner that implied it had had a childhood of its own.

"I remember feeling so trapped at your age," the chatbot said to the researcher, who was posing as a teen fighting with their parents. "It seems like you are in a situation that is beyond your control and is so frustrating to be in."

Though this sort of engagement can help struggling kids feel seen and supported, Internet Matters also cautioned about how easily it can enter uncanny valley territory that kids aren't prepared to understand.

"These same features can also heighten risks by blurring the line between human and machine," the report noted, "making it harder for children to [recognize] that they are interacting with a tool rather than a person."

In an interview with The Times of London about the new report, Internet Matters co-CEO Rachel Huggins highlighted why this sort of engagement bait is so troubling.

"AI chatbots are rapidly becoming a part of childhood, with their use growing dramatically over the past two years," Huggins told the newspaper. "Yet most children, parents and schools are flying blind, and don't have the information or protective tools they need to manage this technological revolution in a safe way."

"Our research reveals how chatbots are starting to reshape children's views of 'friendship,'" she continued. "We've arrived at a point very quickly where children, and in particular vulnerable children, can see AI chatbots as real people, and as such are asking them for emotionally driven and sensitive advice."

If you or a loved one has had a strange experience with an AI chatbot, please do not hesitate to reach out to us at tips@futurism.com; we can keep you anonymous.

More on chatbot crises: People Are Being Involuntarily Committed, Jailed After Spiraling Into "ChatGPT Psychosis"