AI Will Never Be Your Kid’s ‘Friend’

ChatGPT thinks I’m a genius: My questions are insightful; my writing is strong and persuasive; the data that I feed it are instructive, revealing, and wise. It turns out, however, that ChatGPT thinks this about pretty much everyone. Its flattery is intended to keep people engaged and coming back for more. As an adult, I recognize this with wry amusement—the chatbot’s boundless enthusiasm for even my most mediocre thoughts feels so artificial as to be obvious. But what happens when children, whose social instincts are still developing, interact with AI in the form of perfectly agreeable digital “companions”?

I recently found myself reflecting on that question when I noticed two third graders sitting in a hallway at the school I lead, working on a group project. They both wanted to write the project’s title on their poster board. “You got to last time!” one argued. “But your handwriting is messy!” the other replied. Voices were raised. A few tears appeared.

Ten minutes later, I walked past the same two students. The poster board had a title, and the students appeared to be working purposefully. The earlier flare-up had faded into the background.

That mundane scene captured something important about human development that digital “friends” threaten to eliminate: the productive friction of real relationships.

Virtual companions, such as the chatbots developed by Character.AI and PolyBuzz, are meant to seem like intimates, and they offer something seductive: relationships without the messiness, unpredictability, and occasional hurt feelings that characterize human interaction.
PolyBuzz encourages its users to “chat with AI friends.” Character.AI has said that its chatbots can “hear you, understand you, and remember you.” Some chatbots have age restrictions, depending on the jurisdiction where their platforms are used—in the United States, people 14 and older can use PolyBuzz, and those 13 and up can use Character.AI. But parents can permit younger children to use the tools, and determined kids have been known to find ways to get around technical impediments.

The chatbots’ appeal to kids, especially teens, is obvious. Unlike human friends, these AI companions will think all your jokes are funny. They’re programmed to be endlessly patient and to validate most of what you say. For a generation already struggling with anxiety and social isolation, these digital “relationships” can feel like a refuge.

But learning to be part of a community means making mistakes and getting feedback on those mistakes. I still remember telling a friend in seventh grade that I thought Will, the “alpha” in our group, was full of himself. My friend, seeking to curry favor with Will, told him what I had said. I suddenly found myself outside the group. It was painful, and an important lesson in not gossiping or speaking ill of others. It was also a lesson I could not have learned from AI.

As summer begins, some parents are choosing to allow their kids to stay home and “do nothing,” also described as “kid rotting.” For overscheduled young people, this can be a gift. But if unstructured time means isolating from peers and living online, and turning to virtual companions over real ones, kids will be deprived of some of summer’s most essential learning. Whether at camp or in classrooms, the difficulties children encounter in human relationships—the negotiations, compromises, and occasional conflicts—are essential for developing social and emotional intelligence.
When kids substitute AI “friendships” that lack any friction for these challenging exchanges, they miss crucial opportunities for growth.

Much of the reporting on chatbots has focused on a range of alarming, sometimes catastrophic, cases. Character.AI is being sued by a mother who alleges that the company’s chatbots led to her teenage son’s suicide. (A spokesperson for Character.AI, which is fighting the lawsuit, told Reuters that the company’s platform has safety measures in place to protect children, and to restrict “conversations about self-harm.”) The Wall Street Journal reported in April that in response to certain prompts, Meta’s AI chatbots would engage in sexually explicit conversations with users identified as minors. Meta dismissed the Journal’s use of its platform as “manipulative and unrepresentative of how most users engage with AI companions” but did make “multiple alterations to its products,” the Journal noted, after the paper shared its findings with the company.

These stories are distressing. Yet they may distract from a more fundamental problem: Even relatively safe AI friendships are troubling, because they cannot replace authentic human companionship.

Consider what those two third graders learned in their brief hallway squabble. They practiced reading emotional cues, experienced the discomfort of interpersonal tension, and ultimately found a way to collaborate. This kind of social problem-solving requires skills that can be developed only through repeated practice with other humans: empathy, compromise, tolerance for frustration, and the ability to repair relationships after disagreement. An AI companion might simply have concurred with both children, offering hollow affirmations without the opportunity for growth. Your handwriting is beautiful! it might have said.
I’m happy for you to go first.

But when children become accustomed to relationships requiring no emotional labor, they might turn away from real human connections, finding them difficult and unrewarding. Why deal with a friend who sometimes argues with you when you have a digital companion who thinks everything you say is brilliant?

The friction-free dynamic is particularly concerning given what we know about adolescent brain development. Many teenagers are already prone to seeking immediate gratification and avoiding social discomfort. AI companions that provide instant validation without requiring any social investment may reinforce these tendencies precisely when young people need to be learning to do hard things.

The proliferation of AI companions reflects a broader trend toward frictionless experiences. Instacart enables people to avoid the hassles of the grocery store. Social media allows people to filter news and opinions, and to read only those views that echo their own. Resy and Toast save people the indignity of waiting for a table or having to negotiate with a host. Some would say this represents progress. But human relationships aren’t products to be optimized—they’re complex interactions that require practice and patience. And ultimately, they’re what make life worth living.

In my school, and in schools across the country, educators have spent more time in recent years responding to disputes and supporting appropriate interactions between students. I suspect this turbulent social environment stems from isolation born of COVID and more time spent on screens. Young people lack experience with the awkward pauses of conversation, the ambiguity of social cues, and the grit required to make up with a hurt or angry friend.
This was one of the factors that led us to ban phones in our high school last year—we wanted our students to experience in-person relationships and to practice finding their way into conversations even when doing so is uncomfortable.

This doesn’t mean we should eliminate AI tools entirely from children’s lives. Like any technology, AI has practical uses—helping students understand a complex math problem; providing targeted feedback when learning a new language. But we need to recognize that AI companions are fundamentally different from educational or creative AI applications.

As AI becomes more sophisticated and ubiquitous, the temptation to retreat into frictionless digital relationships will only grow. But for children to develop into adults capable of love, friendship, and cooperation, they need to practice these skills with other humans—mess, complications, and all. Our present and future may be digital. But our humanity, and the task of teaching children to navigate an ever more complex world, depends on keeping our friendships analog.