Florida man fabricated an AI video of two Black men breaking into a police car to go viral, then got arrested in Puerto Rico trying to escape it

Alexis Martínez-Arizala, a 22-year-old from Lake Worth Beach, Florida, has been arrested in Puerto Rico on charges of making a false report of a crime and tampering with or fabricating physical evidence. He allegedly created an AI-generated video showing two Black men breaking into a police car, then tried to use the fake footage to get attention online.

The incident began on March 24, when Martínez-Arizala approached a deputy inside a store in Lake Mary, Florida, and told him he had just seen two people getting into the deputy’s patrol vehicle. He then showed the deputy a three-second video that appeared to show two Black men entering what looked like the deputy’s car. Martínez-Arizala, listed as white in the police report, presented the video as real evidence.

According to the South Florida Sun Sentinel, the deputy was not convinced for long. After Martínez-Arizala left, the deputy checked his patrol car and found nothing disturbed or stolen. He then reviewed the store’s outside surveillance footage, which showed that no one had approached the car except Martínez-Arizala himself, who was seen walking toward the vehicle with his phone out just before entering the store.

AI deepfake tools have become dangerously easy for anyone to access and misuse

After reviewing the video several times, the deputy noticed clear inconsistencies. The patrol car in the video did not say “Seminole Sheriff” on the doors, the two men were not visible through the windshield after entering the vehicle, and the rear door to the prisoner compartment opened and closed on its own: classic signs of an AI-generated deepfake.

Investigators also found that Martínez-Arizala had made social media posts about the encounter, seemingly trying to make it go viral. He was eventually arrested in Puerto Rico and will be extradited to Seminole County, Florida, to face the charges. He is being held on a $7,000 bond.
Authorities have arrested a South Florida man accused of creating and sharing a deceptive AI-generated video involving a law enforcement officer, officials said. https://t.co/67hk5el619
— FOX 35 Orlando (@fox35orlando) April 8, 2026

This case is part of a much larger and growing problem. Experts say deepfake fraud has gone “industrial,” with tools that create convincing fake videos now cheap and easy to find. Simon Mylius, an MIT researcher linked to the AI Incident Database, said “capabilities have suddenly reached that level where fake content can be produced by pretty much anybody.” He noted that “frauds, scams and targeted manipulation” made up the largest share of incidents reported to the database in 11 of the past 12 months. “It’s become very accessible to a point where there is really effectively no barrier to entry,” he said.

Fred Heiding, a Harvard researcher studying AI-powered scams, added, “The scale is changing. It’s becoming so cheap, almost anyone can use it now. The models are getting really good, they’re becoming much faster than most experts think.”

Real-world examples include a deepfake video of Western Australia’s premier promoting an investment scheme and fake doctors pushing suspicious skin creams online. Last year, a finance officer at a Singaporean company paid out nearly $500,000 to scammers after believing he was on a video call with company leadership. In the UK, consumers are estimated to have lost £9.4 billion to fraud in the nine months leading up to November 2025.

Jason Rebholz, the chief executive of AI security company Evoke, experienced a deepfake job scam firsthand after posting a job offer on LinkedIn. According to The Guardian, a stranger referred a candidate who looked impressive on paper, but during the video call, things were off.
“The background was extremely fake,” Rebholz said. “It just looked super, super fake. And it was really struggling to deal with [the area] around the edges of the individual. Like part of his body was coming in and out… And then when I’m looking at his face, it’s just very soft around the edges.”

A deepfake detection firm later confirmed the video was AI-generated. Rebholz rejected the candidate but still does not know what the scammer was after. His takeaway: “It’s like, if we’re getting targeted with this, everyone’s getting targeted with it.”

Heiding warns the worst may still be ahead. While deepfake voice cloning is already very convincing, video technology still has room to improve. When it fully matures, he says, it could lead to “the complete lack of trust in digital institutions, and institutions and material in general.”

Seminole County Sheriff Dennis Lemma also issued a strong statement on the case: “The misuse of artificial intelligence to create deepfake videos is a growing concern, particularly when it targets public safety professionals. These fabricated videos can damage reputations, create unnecessary tensions, and raise real safety concerns for the first responders who serve our communities.”