When technology brings words back: how AI allowed a deceased victim to speak in court
Diane Sieger · May 16
Artificial intelligence recently broke new ground in the American legal system, allowing a deceased victim to "speak" at his killer's sentencing. The case of Christopher Pelkey, who was killed in a road rage incident in Arizona, has drawn international attention for its innovative and controversial use of AI in the courtroom.

The technology behind the courtroom statement
Pelkey’s family worked with technologists to produce a nearly four-minute video in which an AI-generated version of Christopher appeared to address his killer. News sources report that audio and video materials were used to recreate his likeness and convey his message.
While it is likely that projects like this draw on personal materials such as social media posts, videos, or voice recordings, it remains unclear whether Christopher’s own writings or messages were part of the process. What is clear, however, is that his family went to great lengths to create a statement they believed truly reflected his voice and character.
Why families turn to AI for victim impact statements
Victim impact statements are a vital part of sentencing hearings, giving families a direct way to communicate the harm a crime has caused. Traditionally read aloud in person or submitted on paper, these statements can be deeply emotional.
With the rise of AI, families like Pelkey’s now have the option to create a more personal and immersive experience by presenting the victim’s own likeness in court, hoping the statement feels more direct and meaningful.
The legal and ethical discussions
While many find this use of technology moving, others feel uneasy. Legal experts have pointed out that a digital recreation could heighten emotional responses in ways we cannot fully predict.
NBC News and other sources have raised concerns about consent, since a deceased person cannot approve the message or the use of their likeness. The legal system may need to develop new guidelines about how and when such technology should be used, and under what conditions its use is ethical.
Is this a sign of things to come?
The Arizona case may be only the beginning. As AI technology improves and becomes more accessible, more families could rely on it to make their voices heard in new ways.
Commentators expect this won’t be the last time an AI-generated statement is seen in court. The broader impact on law, ethics, and grieving remains an open question.
Final thoughts
The use of AI to allow Christopher Pelkey to “speak” from beyond the grave is both emotionally powerful and thought-provoking. As the legal world grapples with the challenges posed by this new technology, society will have to consider when and how digital recreations are appropriate.
This story provides a glimpse into a future where technology and humanity intersect in unexpectedly personal ways.
It also highlights the growing need for all of us to think about our own digital footprint and to make plans for how we want our online presence and personal data to be handled after we are gone.


