When Virality Is The Message: The New Age of AI Propaganda


Welcome to the world of AI-driven propaganda. In March 2026, AI-generated videos depicting American President Donald Trump and Israeli Prime Minister Benjamin Netanyahu as LEGO minifigures began flooding social media. Some were set to original rap tracks with surprisingly catchy hooks, likely also AI-generated. They told stories of the horrors of war: little shoes and a plastic backpack near rubble evoked the bombing of the Shajareh Tayyebeh girls' school in Minab, Iran. Some connected the war to Trump’s ties to the convicted sex offender Jeffrey Epstein. Others mockingly evoked the brutal defeat of American forces, with LEGO toy soldiers walking into rivers of blood or returning home in tiny caskets draped in American flags.

Media coverage attributed the videos, some of which were broadcast on Iranian state television in addition to spreading online, to the Revayat-e Fath Institute. The name translates to “The Narration of Victory”; coverage linked the content to the Islamic Revolutionary Guard Corps, which has been implicated in other influence operations and hacking campaigns. Most of the viral videos, however, bear the logo of the “Explosive News Team,” a self-described grassroots group that has claimed credit on X for being “that Iranian Lego animation guys.” On the afternoon of March 28, they complained that their YouTube and Instagram accounts had been taken down. In email correspondence with The New Yorker, they claim to be student-run and totally independent; Revayat-e Fath, they say, is simply the Persian title of their videos. Are Explosive News “patriotic trolls” producing heartfelt content to support Iran’s ruling regime, paid mercenaries leveraged for plausible deniability, or a state-run account posturing as independent?
Social media platforms that have taken their accounts down will have to justify their decisions in integrity reports. Other videos are more obviously state-linked: an Iranian Embassy X account posted an AI animation mocking Trump using visuals styled after the Pixar movie “Inside Out.” Generative AI has made it cheap and easy to produce polished propaganda at scale, and just as easy to blur the line between official messaging and opportunistic imitation. Packaging war in the visual language of entertainment makes conflict propaganda more likely to spread, regardless of who made it. Social media is an open playing field: any government, proxy group, or anonymous account can compete for the same audience, and because users are active participants, the most compelling content wins the most reach regardless of its origin or intent.

The trend is international. The White House has been creating AI-generated content, too. It posted a video promoting Operation Epic Fury that opened with a cursor clicking “Start” in the unmistakable style of Nintendo’s Wii Sports, then cut between war footage (ordnance detonating against Iranian targets) and cartoon bowling strikes.

Videos seen on the White House X account depicting different video games cut with war footage. —The White House/X

Other videos shared by the White House spliced scenes from Call of Duty with real airstrike footage, clips from Grand Theft Auto: San Andreas and Braveheart, and audio nods to Top Gun and Mortal Kombat. The Iranian videos tell stories of horror and of Trump’s humiliation; the White House clips project dominance and military might. But both package war in the familiar language of entertainment.

Governments communicating through memes and toy animations during an actual shooting war may feel like a bizarre aberration. But this is what overt propaganda looks like in a platform age.
The underlying logic works like this: borrow the visual language of games, memes, and children’s toys; place it in contexts jarring enough to capture attention; and let engaged audiences do the rest. Social media users don’t need to endorse a message to spread it. They only need to find it compelling enough to watch and share.

For more than a decade, states have treated social media as core infrastructure for influence. They run full-spectrum propaganda strategies that push messaging through multiple channels simultaneously: state television, official accounts, covert troll networks, and sympathetic influencers who may not even know they’re part of the operation. The result is not just a broadcast. It turns the audience into a distribution channel larger than any the state could build on its own. Propaganda today is participatory; things don’t just “go viral” on their own. People engage with content.

The template was arguably set in 2015, when the Islamic State (ISIS) terrorist group released No Respite, a four-minute English-language recruitment video that premiered about a week after the Bataclan attacks in Paris. Every frame was built to look like something a Western teenager steeped in gaming culture would find appealing: tight cuts, pacing straight out of an action-movie trailer. It was made both to mock the coalition forces and to make ISIS feel like an aspirational brand. Platforms played whack-a-mole as “swarms” of intrigued users shared it. Researchers began to speculate about the future of memetic warfare.

LEGO-specific propaganda entered the state-media repertoire in April 2020, when China’s Xinhua News Agency released “Once Upon a Virus,” a video in which LEGO minifig terra cotta warriors and doctors in masks sparred with an increasingly feverish, belligerently chaotic Statue of Liberty about the coronavirus pandemic. “We are always correct, even though we contradict ourselves,” she says, hooked up to an IV.
Posted on YouTube and Twitter, both of which are blocked within China but which Beijing uses for overseas messaging, the video was viewed millions of times. There were glaring strategic omissions in it: LEGO Chinese doctors warning the Statue of Liberty about the virus seemed brazenly ironic to those of us who had followed China’s aggressive policing of its own doctors as the pathogen began to spread. But the format wrapped a simple message, “the U.S. is failing,” in a mockingly funny, childlike aesthetic that made it easy to watch and frictionless to share. The video bypassed the skepticism that a state media editorial, or even real human actors, would have triggered, while poking at real incoherencies in America’s pandemic response. You didn’t have to like the Chinese Communist Party’s worldview to forward it to a friend. You could pass it along in frustrated agreement with a grain of truth, or simply because it was surreal: “Can you believe the Chinese government made this?”

Russia picked up the LEGO aesthetic as well. Ahead of Moldova’s 2025 parliamentary elections, Russian propagandists circulated images of fabricated LEGO sets depicting soldiers with Ukrainian and Moldovan flags, designed to stoke fears that Moldova would be dragged into the war if it supported certain parties. The plastic brick, it turns out, is remarkably versatile as an instrument of statecraft.

What connects all of these cases, from ISIS’s action-trailer aesthetic to the LEGO bricks of China, Russia, and Iran to the U.S.’s video game edits, is a shared recognition of how information moves online. The currency of social media is not authority or accuracy; it is engagement. The content that travels fastest combines familiarity with novelty: a trope people instantly recognize, deployed in a setting jarring enough to make them react. That is why propaganda now so often arrives as a meme, parody, or spectacle. Users do not have to agree with it to help distribute it.
They only have to engage with it.

Ridicule and satire are especially popular in this environment because both are captivating and very difficult to counter. A factual rebuttal to a LEGO animation or a Wii-style bombing meme almost inevitably looks plodding, humorless, and tonally mismatched.

The downstream consequence of all this is that the spectacle reaches more people than the reality. White House press secretary Karoline Leavitt bragged that the White House’s videos had generated more than 2 billion impressions. Some analyst commentary argued that Iran’s LEGO videos had outpaced Trump’s. Regardless of who is winning the virality contest, both figures dwarf the reach of any individual news report about the actual events in question. Stories of a bombed school, military casualties, and burning oil fields are processed through memes. Increasingly, people encounter war first as content, and only later, if at all, as news.

As propaganda theorist Jacques Ellul argued in the early 1960s, propaganda evolves with the communication systems that carry it. In a social media environment shaped by algorithms, virality, and now generative AI, propaganda increasingly takes the form not of doctrine, nor even of messaging optimized to persuade, but of content made to travel. State-run accounts can generate an endless stream of LEGO animation, or even deepfaked battle footage, for as long as audiences appear interested in engaging with it. Copycat accounts, some state-linked, others simply chasing revenue or clout, can flood the zone with variants. That blurs attribution and complicates moderation, forcing platforms to make difficult and increasingly opaque judgments about accounts like Explosive News. What counts as state propaganda, what counts as coordinated manipulation, and what remains in bounds?

When the topic of propaganda is raised, the question that most often follows relates to persuasion: Do these videos actually change minds? Sometimes they may.
The Iranian LEGO videos are clearly designed to undermine support for the war. But that is too narrow a measure of success. The deeper effect is environmental. Viral propaganda creates the atmosphere through which a conflict is perceived: it shapes what feels salient, what seems ridiculous, who seems triumphant, what feels righteous. The White House videos aren’t trying to convert opponents; they’re performing dominance for an audience that already supports the war (even as more than half of the American public does not). But when the meme becomes the primary text and the news itself remains in the background, the spectacle doesn’t have to change your mind. It just has to win the war for the attention of your target audience.