When a loved one passes away, we’re often left bereft and longing to see them just one more time – to gaze upon their face, to hear their voice, to ask advice.
Now, thanks to AI, you can. UWM student researcher Fridarose Hamad is not so sure that’s a good thing.
“(There) are these things called ‘grief bots,’ that are replications of people after they pass that aid people in mourning them,” she explained. “When we start interrogating it, more and more questions start to arise.”
Hamad is a senior majoring in English and political science. She’s also a prolific student researcher. She began even before she started classes through the UR@UWM program, and then continued to work under her mentors, Associate Professor of Sociology Celeste Campos-Castillo (now at Michigan State University) and UWM Associate Professor of Public Health Policy Linnea Laestadius. Hamad’s previous research focused on how teens interact with technology, but lately, she’s been focused on “grief tech” – technology that uses AI to develop profiles of departed loved ones so that the living can talk, text, or interact with their likeness.
Hamad's focus on her work is impressive; earlier this year, she was named a UWM Senior Excellence in Research Award (SERA) winner, and she plans to present her research at conferences like Research in the Rotunda and the National Conference for Undergraduate Research in Pittsburgh.

The technology of grief
Grief tech existed before the advent of chatbots, but thanks to AI, it is expanding into new avenues and is evolving every day. Some companies allow a person to create a bot that can talk to their loved ones after they pass away. Companies like You Only Virtual specifically offer to construct grief bots for people in mourning. Services like Replika, which seeks to provide users with an AI companion, have been used to create grief bots as well. One Japanese company even builds virtual reality simulations so that people can interact with their deceased pets.
“You go onto one of these websites, and you create a chat bot feature so that you can continue to text Grandma after she’s passed,” Hamad said. “You might upload photos of her, audio clips, any other text messages that you might have from her. You’re uploading all of this content to get the fullest picture of Grandma possible.”
Does this technology actually help the grieving process? It’s hard to say. The field is so new that researchers have only begun to look into grief tech, and what research exists uses a patchwork of vocabulary as each researcher comes up with their own definitions to describe the same functions and features.
That’s why Hamad’s work is important.
“My goal with this project became to create some continuity. Let’s create some set definitions. Let’s create the framework that we can continue to dissect this with,” she said. “I read about 40 or 50 papers with just completely different language all describing the same things. And I know that because all of them reference the same episode of ‘Black Mirror’ or they’ll reference the same companies.”
For example, Hamad draws a distinction between “ghost bots” – AI meant to mimic a deceased person like Napoleon or Queen Elizabeth II – and “grief bots” – AI meant to mimic a deceased person whom another person is mourning – whereas other researchers had been using the terms interchangeably.
Hamad is working on a paper that she hopes will give researchers a common vocabulary to draw from as they explore questions surrounding the psychological, technical, and ethical aspects of grief bots.
Grief tech raises questions
It's important to study grief tech because, like many aspects of AI, the technology is so new that no policies are in place to govern its use. And there are a lot of questions about that use.
“You’re uploading a lot of personal information onto these sites. What if the site gets hacked?” Hamad pointed out.
Beyond data safety, there is the issue of personal privacy and the ethical concerns it raises, she added. The person who is deceased can't give consent for their likeness to be used for a chatbot. Would they have wanted their personality uploaded online? What if a teenager's friend passes away and they create a grief bot to help them cope? Is that fair to the parents of the deceased teen? What happens when a person is ready to move on? Is it okay to delete the grief bot, or would it be akin to 'killing' their loved one? What are the responsibilities of grief tech companies? What if the person someone is grieving is a still-living ex-partner? Is it ethical to make a bot based on their personality to work through the breakup?
“These are only some of the issues, but I think it hints at the bigger questions that we have around what this looks like and how it functions,” Hamad said. “Before we start regulating it, there needs to be more work done in understanding the full scope of (grief tech).”
She hopes that she can impress that message upon lawmakers when she presents her work at Research in the Rotunda in April.
In the meantime, Hamad stresses that grief is a natural part of being human. Perhaps grief bots can give people a sense of closure, or perhaps they hinder the healing process by preventing mourners from adjusting to a new normal. Either way, grief tech is another example of how technology is influencing our lives – and our deaths.
By Sarah Vickery, College of Letters & Science