Family creates AI video to depict Arizona man addressing his killer in court

Chandler, AZ – May 9, 2025

In a groundbreaking and emotionally charged moment, the family of Christopher Pelkey, a 37-year-old Army veteran killed in a 2021 road rage shooting, used artificial intelligence to create a video of him delivering a victim impact statement at the May 1, 2025, sentencing of his killer, Gabriel Paul Horcasitas, in Maricopa County Superior Court. In what is believed to be the first use of AI to deliver a victim impact statement in a U.S. courtroom, an AI-generated avatar of Pelkey expressed forgiveness, moving the judge, the family, and attendees. The case has sparked discussion about the ethical and legal implications of AI in judicial settings.

The Incident and Creation of the AI Video

Christopher Pelkey, a devout Christian and Iraq/Afghanistan veteran, was fatally shot by Horcasitas on November 13, 2021, at a red light near Gilbert and Germann roads in Chandler, Arizona. According to court records, Pelkey exited his truck and approached Horcasitas’ car after a confrontation, prompting Horcasitas to fire two shots, one of which killed Pelkey. Horcasitas was convicted of reckless manslaughter in a 2025 retrial, following procedural issues in the initial 2023 trial.

Stacey Wales, Pelkey’s sister, struggled to craft a victim impact statement that captured her brother’s humanity. Inspired by his forgiving nature, she collaborated with her husband, Tim, and a tech-savvy family friend to create an AI-generated video instead. Using voice recordings, videos, and a single photo of Pelkey, they digitally recreated his likeness, with a trimmed beard, a green sweatshirt, and neither his glasses nor the logo on his hat, to deliver a script Wales wrote. The process involved replicating Pelkey’s voice and speech patterns, inserting laughs, and blending in real video clips, including one with an “old age” filter, to reflect his humor and personality.

The video, played during the May 1 sentencing hearing, began with the avatar clarifying, “I’m a version of Chris Pelkey recreated through AI,” its artificial nature evident in slight audio gaps and mismatched mouth movements. It thanked the 49 family members and friends who submitted impact statements, then addressed Horcasitas: “It is a shame we encountered each other that day in those circumstances. In another life, we probably could have been friends. I believe in forgiveness and in God who forgives. I always have and I still do.” The avatar concluded with a call to “love one another and live life to the fullest,” ending with, “Well, I’m gonna go fishing now. Love you all. See you on the other side.”

Impact on the Court and Family

The video profoundly affected the courtroom. Maricopa County Superior Court Judge Todd Lang praised it, saying, “I loved that AI. Thank you for that. I felt like that was genuine; that his obvious forgiveness of Mr. Horcasitas reflects the character I heard about today.” Moved by the video and the family’s 49 letters, Lang sentenced Horcasitas to 10.5 years for manslaughter, a year more than the state’s 9.5-year request, and 12.5 years in total including an endangerment charge. He later requested a copy of the video to share with peers, a further sign of its impression on him.

For Pelkey’s family, the video was cathartic. Wales kept it a surprise, allowing authentic reactions from relatives, including her teenage son, who thanked her for letting him “hear Uncle Chris one last time.” Her brother John described “waves of healing” from seeing Pelkey’s likeness express forgiveness, which aligned with his known character. Wales told NPR that crafting the video, adjusting Pelkey’s appearance and voice, was itself a healing process, enabling her to revisit his photos after years of grief. Reflecting on her own mortality, she also recorded a nine-minute video of herself laughing and talking so that her voice would be preserved for her family.

Legal and Ethical Considerations

The use of AI in this context was legally permissible: Arizona law allows victims to deliver impact statements in any digital format, without prior disclosure to the defense or the judge, as victims’ rights attorney Jessica Gattuso confirmed. Gattuso, initially hesitant because of the novelty, supported the video after viewing it, believing its message of forgiveness would resonate, including with Horcasitas. Neither the defense nor Lang objected, though Horcasitas’ lawyer, Jason Lamm, filed a notice of appeal hours after sentencing; whether the judge gave undue weight to the AI video is likely to be among the issues reviewed.

AI experts, including Maura Grossman of the University of Waterloo, hailed the case as a novel application, likely the first U.S. use of AI for a victim impact statement. However, it raises ethical concerns. University of Colorado law professor Harry Surden warned that generative AI, while effective for humanizing victims, could manipulate emotions, bypassing “natural skepticism” as simulations become more realistic. Arizona State University’s Gary Marchant, part of the Arizona Supreme Court’s AI committee, noted that while the video’s value outweighed prejudice in this case, AI-generated evidence could be problematic in other contexts, especially if it misrepresents a victim’s wishes. Duke Law’s Paul Grimm emphasized the need to balance AI’s persuasive power with fairness to ensure it doesn’t distort judicial decisions.

Arizona courts are proactively addressing AI’s role. Chief Justice Ann Timmer, while not commenting directly on the case, highlighted the state’s AI committee, formed to study best practices. The committee is exploring risks like deepfake evidence, given AI’s accessibility—anyone can create such content on a phone. Arizona already uses AI for administrative tasks and to explain rulings on YouTube, but victim impact statements mark a new frontier. Timmer stressed that AI use must align with existing guidelines to avoid undermining a fair trial.

Broader Context and Public Reaction

The case reflects AI’s growing courtroom presence, following incidents like Michael Cohen’s 2023 use of fake AI-generated legal citations and a 2025 attempt to use an AI lawyer avatar, which was rejected. Unlike those cases, Pelkey’s video was embraced, likely due to its non-evidentiary role and emotional authenticity.

Public sentiment on X was mixed. @MarioNawfal and @sbauerAP highlighted the unprecedented nature of AI giving Pelkey a voice, framing it as a technological milestone. @JudgeFergusonTX noted the lack of objections and judicial impact, while @christoferguson expressed unease, stating, “This should have never been allowed.” @KatieConradKS criticized media for implying Pelkey himself spoke, preferring The Guardian’s “AI version” framing.

Significance and Future Implications

Wales defended the ethical use of AI, likening it to a hammer: “It can break a window or build a house.” She aimed to humanize Pelkey, ensuring Judge Lang saw his impact beyond the autopsy photos shown in court. The video’s success, evident in the judge’s reaction and the family’s closure, underscores AI’s potential to amplify victims’ voices, particularly in sentencing phases, where emotional appeals are permitted.

However, experts like Marchant and Surden warn of a “slippery slope.” Without regulation, AI could be used to fabricate statements or sway juries unfairly, especially in evidentiary phases. The U.S. Judicial Conference is developing standards to ensure AI-generated evidence meets reliability criteria, similar to expert testimony. Arizona’s proactive approach may set a precedent, but the Pelkey case highlights the need for clear guidelines as AI becomes more pervasive.

For Pelkey’s family, the video was a triumph of justice and healing, allowing a final farewell. For the legal system, it’s a test case, signaling AI’s transformative potential and the urgent need to balance innovation with fairness.

For more details, visit NPR or Reuters.