The Cambridge Dictionary has crowned ‘parasocial’ as 2025’s ‘Word of the Year’, pointing to increased interest in the term this year and tying it to the “rise in popularity of AI (artificial intelligence) companions”.

The timing couldn’t be more telling. As loneliness tightens its grip worldwide, affecting one in six people globally according to the World Health Organisation, we are witnessing a profound shift in how humans seek connection. AI chatbots have evolved from novelties into necessities, seamlessly filling different roles: work assistants, therapists, confidants, and even romantic partners.

Cambridge captured this cultural moment, updating the definition of ‘parasocial’ in September 2025 to include the “possibility of a relationship with AI”.

What does ‘parasocial’ mean?

Sociologists Donald Horton and R Richard Wohl coined the word in their 1956 paper ‘Mass Communication and Para-Social Interaction’. It combines the term ‘social’ with the Greek-derived prefix ‘para’, which means “closely resembling” or “almost”.

Horton and Wohl used the term to describe the relationship viewers form with the media personalities they see on television. They called it a “seeming face-to-face relationship between spectator and performer”.

The interaction, characterised as “one-sided”, is illusory: the “fan comes to believe that he ‘knows’ the persona more intimately and profoundly than others do”.

It’s why some hunt for easter eggs in Taylor Swift’s discography, convinced that they offer glimpses into her real-life experiences, why many take sides in celebrity face-offs, or, in extreme cases, why fans turn into stalkers. Some even form parasocial relationships with fictional characters.

Researchers have also traced membership in extreme ideologies and political alignments to parasocial relationships, which fulfil the need for belonging.

Parasocial interactions in the digital, AI age

The internet has dramatically amplified parasocial relationships. The celebrities, or ‘performers’ so to speak, are closer than ever, quite literally in the palm of our hands, accessible across a plethora of devices. Their performances are available at any time of the day.

With social media, fans believe they have greater access to, and more intimate knowledge of, the celebrity. The term now encompasses online celebrities such as YouTubers, influencers, and commentators. Case in point: Cambridge noted a spike in searches for ‘parasocial’ in June 2025 after YouTube streamer IShowSpeed blocked a fan who described themselves as his “number 1 parasocial” in an elaborate X thread about his break-up.

Podcasts, too, are increasingly designed around an intimate format, giving the illusion of friends chatting. Online discourse on Reddit or X dissects the private lives of celebrities, with users fawning over or hating on them and boasting of intimate details (an undisclosed fan interaction, or knowledge from someone who has worked with them).

Then there are the AI chatbots. Most social media platforms now offer AI companions to users. On Instagram, you can slide into the DMs to have conversations with an AI. Grok’s AI companion Ani, styled after Japanese anime, has been called a “girlfriend simulator”.
Chatbot services like Replika and Character.ai offer different personas for companionship to millions of users. Beyond these character-based companions, people are becoming increasingly dependent on chatbots like OpenAI’s ChatGPT, Google’s Gemini, or Anthropic’s Claude for ‘how-to’ advice, decision-making, or even confiding.

A warning…

Humans are inherently social creatures, driven by a deep need for connection and belonging. In that light, parasocial relationships aren’t inherently unhealthy. Research shows that when children identify with on-screen characters, such as Dora the Explorer, they often learn more effectively. These bonds can also support adolescents as they shape their identities.

Parasocial ties can even nurture real communities, bringing together people with shared interests who may go on to form genuine friendships.

The trouble begins when these one-sided relationships become a substitute for actual human contact. When they do, they can encourage isolation and exacerbate mental-health challenges, a problem intensified by the rise of AI chatbots.

Experts warn that we still don’t fully understand how sustained interaction with AI companions affects emotional well-being. A pair of recent studies by MIT Media Lab and OpenAI offer some clues. They found that “heavy users” of ChatGPT were more likely to consider it a “friend” or to attribute human-like emotions to it. These heavy users, who engaged in “personal” conversations with the chatbot, also reported the highest levels of loneliness, which were magnified if the user set the chatbot’s voice mode to the opposite gender. The studies, however, acknowledge their own limitations: it remains unclear whether heavy usage causes loneliness or whether lonely individuals are more prone to heavy use and to developing emotional bonds. They conclude that, in addition to technological guardrails to ensure AI does not replace human connection, there is also a need for “societal interventions” that foster human relationships.

The need for such interventions is more urgent than ever. ChatGPT alone is used by around 10 per cent of the world’s population, according to OpenAI, and 70 per cent of those conversations are “non-work-related”. Tech platforms are racing to make chatbots ever more human-like. Billionaires are betting on the metaverse, a virtual space where human beings interact with each other as digital avatars, as the internet’s future.

So, in choosing ‘parasocial’ as its Word of the Year, Cambridge not only captures the cultural zeitgeist but also reminds us of the need to preserve human connection.