AI Ethicists Highlight Three Horrifying Scenarios In Which Griefbots Could Stalk The Living

May 10, 2024 by Deborah Bloomfield

Speaking to the dead is now a reality, as artificial intelligence (AI) technology has made “deadbots” and “griefbots” possible. These chatbots can simulate the language and personality of our deceased nearest and dearest, offering comfort to those who are grieving. But University of Cambridge scientists warn that griefbots could cause more harm than good, creating digital “hauntings” that lack safety standards.

The ethics of grief tech were brought into focus by one man’s experience with a tool known as Project December. Built on an early release of the language model GPT-3, Project December offered paying users the chance to speak with preset chatbot characters or use the underlying machine learning technology to create their own. Writer Joshua Barbeau was one such user, and he went on the record about his experience teaching Project December to speak like his fiancée, who at the time had been dead for over eight years.

After Barbeau fed the AI samples of her texts and a personal description, Project December was able to piece together lifelike responses, using language models to mimic her speech in text-based conversation. The authors of a new study argue that these AI creations, built on the digital footprints of the deceased, raise concerns about potential misuse, which – grim as it is to contemplate – includes the possibility of advertising being slipped in under the guise of a loved one’s thoughts.

They also suggest that such technologies may further distress children grappling with the death of a loved one by maintaining the illusion that their parent is still alive. Their concern is that, in doing so, griefbots fail to honor the dignity of the deceased while also undermining the wellbeing of the living.

These thoughts were mirrored by psychologist Professor Ines Testoni of the University of Padova, who told IFLScience that the biggest thing we have to overcome after the death of a loved one is facing the fact that they are no longer with us.

“The greatest difficulty concerns the inability to separate from those who leave us, and this is due to the fact that the more you love a person, the more you would like to live together with them,” Testoni told IFLScience for the March 2024 issue of CURIOUS. “But also, the more one loves one’s habits, the more one wants to ensure that they do not change. These two factors make the work involved in separating and resetting a life that is different from what it was before death entered our relational field very time-consuming.”

Testoni says the suffering that accompanies this is related to a lack of understanding of what it means to die. Much of the discourse surrounding what happens after we die is conceptually vague, making it tempting to look for evidence wherever we can, and to find it, when we’re struggling to let go.

“A vast literature describes the phenomenon of continuing bonds, i.e. the psychological strategies of the bereaved to keep the relationship with the deceased alive,” explained Testoni. “Death education can help to deal with these kinds of experiences by allowing us to become aware of these processes and especially to understand where the doubt about the existence beyond death comes from, which leads us to painfully question where the deceased is.”

To demonstrate their concerns, the Cambridge AI ethicists outline three scenarios in which griefbots could be harmful to the living:

  • MaNana – a conversational AI service that simulates a dead relative, such as a grandmother, without the consent of the “data donor,” aka Grandma. It may run on a premium trial that, once it ends, starts slipping advertising into the conversation in the form of Grandma suggesting the bereaved order food from a certain delivery service.
  • Paren’t – A terminally ill woman may leave a griefbot that simulates her own personality with the goal of assisting a child through their grief after she has died. The griefbot may initially provide comfort, but if it starts to generate confusing responses, such as suggesting an in-person meet-up, it could delay the child’s healing.
  • Stay – An adult may create a griefbot of themselves to engage with their children after their own death, but that’s not to say that all of the children want it. One of the children, now an adult, may wish to disengage from the tech, only to be barraged with emails from their dead parent. Suspending the griefbot may feel like a violation of the wishes of the deceased, resulting in guilt and distress as the living feel they have no way out.

“We must stress that the fictional products represent several types of deadbots that are, as of now, technologically possible and legally realizable,” wrote the authors. “Our scenarios are speculative, but the negative social impact of re-creation services is not just a potential issue that we might have to grapple with at some point in the future. On the contrary, Project December and other products and companies mentioned in [the study] illustrate that the use of AI in the digital afterlife industry already constitutes a legal and ethical challenge today.”

The authors urge that griefbots be built with consent-based design processes that include opt-out protocols and age restrictions for users. Furthermore, if we’re to bring the dead back to life in the form of a chatbot, we’re going to need a new kind of ceremony to retire griefbots respectfully. That raises a question: if we are going to have to lose a loved one all over again, is such technology simply delaying the healing process?

“Rapid advancements in generative AI mean that nearly anyone with Internet access and some basic know-how can revive a deceased loved one,” said Dr Katarzyna Nowaczyk-Basińska, study co-author and researcher at Cambridge’s Leverhulme Centre for the Future of Intelligence (LCFI), in a statement. 

“This area of AI is an ethical minefield. It’s important to prioritise the dignity of the deceased, and ensure that this isn’t encroached on by financial motives of digital afterlife services, for example. At the same time, a person may leave an AI simulation as a farewell gift for loved ones who are not prepared to process their grief in this manner. The rights of both data donors and those who interact with AI afterlife services should be equally safeguarded.”

The study is published in Philosophy & Technology.
