In Arizona, a judge allowed an AI avatar of a deceased crime victim to "read" an impact statement. Composite image: Arrington Watkins Architects / AI avatar: YouTube/StaceyWales, CC BY

In November 2021, in the city of Chandler, Arizona, Chris Pelkey was shot and killed by Gabriel Horcasitas in a road rage altercation. Horcasitas was tried and convicted of reckless manslaughter.

When it was time for Horcasitas to be sentenced by a judge, Pelkey's family knew they wanted to make a statement – known as a "victim impact statement" – explaining to the judge who Pelkey had been when he was alive.

They found they couldn't get the words right.

The solution, for them, turned out to be having Pelkey speak for himself: they created an AI-generated avatar that used his face and voice, allowing him to "talk" directly to the judge.

This marked the first time a United States court had allowed an AI-generated victim to make this kind of beyond-the-grave statement, and likely the first time something like this had occurred anywhere in the world.

How was the AI avatar made and received?

The AI avatar was created by Pelkey's sister Stacey Wales and her husband Tim, with Stacey writing the words "spoken" by Pelkey – words that were not taken from anything he actually said while he was alive, but based on what she believed he would have said.

Stacey Wales explained how she came to create an AI video of her brother to allow him to deliver his own victim impact statement.

The avatar was created using samples of Pelkey's voice from videos recorded before his death, along with photos the family had of him – specifically, a photo used at his funeral.

In the video, Pelkey "says" he believes in forgiveness and in "a God who forgives", and that "in another life" he and Horcasitas could have been friends.

After the video was played in court, Judge Todd Lang, who had allowed the AI statement to be delivered, said he "loved" the AI, adding he "heard the forgiveness" contained in it. He also said he felt the forgiveness was "genuine".

Judge Todd Lang's reaction to Chris Pelkey's AI victim impact statement.

In the end, Horcasitas was sentenced to the maximum of ten-and-a-half years – more than the nine years the prosecution was seeking, but equal to what Pelkey's family asked for in their own victim impact statements.

Could this happen in Australia?

Court rules are broadly similar across Australian states and territories, and it is unlikely these technological advances would be acceptable in Australian sentencing courts.

These rules allow victims or their families to read their statement to the court, but this is limited to written statements, usually edited by the prosecution, although victims may include drawings and photos where approved.

A victim will generally read their own statement to the court. However, where the victim has died, family members can make a statement speaking to their own trauma and loss. Sometimes victims ask the prosecutor to read their statement, or the prosecutor simply hands a written statement to the judge.

To date, no Australian court has permitted family members to speak for the deceased victim personally, and family members are generally limited to describing harms they have directly suffered. Victims may also be cross-examined by defence counsel on the content of their statements.

Creating an AI avatar would be time-consuming and expensive for prosecutors to edit.
Cross-examination of an AI avatar by the defence would be impossible.

Compared with the US, there is generally far less tolerance in Australian courts for dramatic readings of statements or for the use of audio-visual materials. In the US, victims enjoy greater freedom to invoke emotion, explore personal narratives and even show videos of the deceased, all to give the court a better sense of the victim as a person. The use of an AI avatar is therefore not too far from what is already allowed in most US courts.

Despite these allowances, there is still concern the emotional impact of a more direct statement from an AI victim could be used to manipulate the court by putting words into the victim's virtual mouth. As can be seen in the Arizona sentencing, Judge Lang was clearly affected by the emotions generated by the AI Pelkey.

Changes to Australian law would be needed to specifically ban the use of AI recordings. But even without such changes, Australian sentencing practice is already so restrictive as to essentially preclude such technology. It seems Australia is some way from joining Arizona in allowing an AI avatar of a deceased person to speak from "beyond the grave".

The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.