Law enforcement has quickly embraced AI for everything from drafting police reports to facial recognition. The results have been predictably dismal.

In one particularly glaring — and unintentionally comedic — instance, the police department in Heber City, Utah, was forced to explain why police report software declared that an officer had somehow shapeshifted into a frog.

As Salt Lake City-based Fox 13 reports, the flawed tool seems to have picked up on some unrelated background chatter to devise its fantastical fairy tale ending.

“The body cam software and the AI report writing software picked up on the movie that was playing in the background, which happened to be ‘The Princess and the Frog,’” police sergeant Rick Keel told the broadcaster, referring to Disney’s 2009 musical comedy. “That’s when we learned the importance of correcting these AI-generated reports.”

The department had begun testing AI-powered software called Draft One to automatically generate police reports from body camera footage. The goal was to reduce the amount of paperwork — but with glaring mistakes slipping through, results clearly vary.

Even a simple mock traffic stop meant to demonstrate what the tool is capable of turned into a disaster. The resulting report required plenty of corrections, according to Fox 13.

Despite the drawbacks, Keel told the outlet that the tool is saving him “six to eight hours weekly now.”

“I’m not the most tech-savvy person, so it’s very user-friendly,” he added.

Draft One was first announced last year by police tech company Axon — the same firm behind the Taser, a popular electroshock weapon.
The software uses OpenAI’s GPT large language models to generate entire police reports from body camera audio. Experts quickly warned that hallucinations could slip into these important documents.

“I am concerned that automation and the ease of the technology would cause police officers to be sort of less careful with their writing,” American University law professor Andrew Ferguson told the Associated Press last year.

Others warn that the software could reinforce pre-existing racial and gender biases, a troubling possibility considering law enforcement’s historic role in perpetuating them long before the advent of AI. Generative AI tools have also been shown to perpetuate biases against both women and non-white people.

“The fact that the technology is being used by the same company that provides Tasers to the department is alarming enough,” Foundation for Liberating Minds in Oklahoma City cofounder Aurelius Francisco told the AP.

Critics also argue that the tool could be used to introduce deniability and make officers less accountable when mistakes do surface. According to a recent investigation by the Electronic Frontier Foundation, Draft One “seems deliberately designed to avoid audits that could provide any accountability to the public.”

According to records obtained by the group, “it’s often impossible to tell which parts of a police report were generated by AI and which parts were written by an officer.”

“Axon and its customers claim this technology will revolutionize policing, but it remains to be seen how it will change the criminal justice system, and who this technology benefits most,” the Foundation wrote.

The Heber City police department has yet to decide whether it will keep using Draft One. The department is also testing a competing AI tool called Code Four, which was released earlier this year.
But considering Draft One’s inability to distinguish between reality and a make-believe world dreamed up by Disney, let’s hope the department thinks long and hard about the decision.

More on AI policing: AI Is Mangling Police Radio Chatter, Posting It Online as Ridiculous Misinformation

The post Cops Forced to Explain Why AI Generated Police Report Claimed Officer Transformed Into Frog appeared first on Futurism.