Experts have issued an urgent warning over an AI tool which 'brings back the dead', branding the tech an 'ethical minefield' which could have 'devastating' consequences.
Technology continues to move at a rapid pace, with artificial intelligence tools forever taking new leaps forward.
However, the moral and ethical implications of emerging tech are not always given proper consideration.
Researchers at the University of Cambridge have begun warning about the future of some AI tools, most disturbingly one that could allow users to hold text and voice conversations with lost loved ones.
In a paper entitled ‘Digital afterlife’: call for safeguards to prevent unwanted ‘hauntings’ by AI chatbots of dead loved ones released on May 8, researchers warn about the wider issues with, well, 'talking' to the dead.
“‘Deadbots’ or ‘Griefbots’ are AI chatbots that simulate the language patterns and personality traits of the dead using the digital footprints they leave behind,” the paper explains.
“Some companies are already offering these services, providing an entirely new type of ‘postmortem presence’."
And if you think this sounds like something straight out of the Black Mirror TV show... that's because it basically is.
An entire episode was dedicated to this very concept in the second season, and it highlighted the dangers and potential psychological harm that a person can go through using this kind of tool.
Dr Katarzyna Nowaczyk-Basińska, study co-author and researcher at Cambridge’s Leverhulme Centre for the Future of Intelligence, emphasized why tools like this can prove dangerous and advised caution within the industry.
She said: “This area of AI is an ethical minefield. It’s important to prioritize the dignity of the deceased, and ensure that this isn’t encroached on by financial motives of digital afterlife services, for example.
“At the same time, a person may leave an AI simulation as a farewell gift for loved ones who are not prepared to process their grief in this manner. The rights of both data donors and those who interact with AI afterlife services should be equally safeguarded.”
Co-author Dr Tomasz Hollanek reiterated these points and suggested it would be crucial to implement a system that ensures the individual can eventually cut ties with the digital person, possibly by holding a form of funeral.
“People might develop strong emotional bonds with such simulations, which will make them particularly vulnerable to manipulation," he said.
“It is vital that digital afterlife services consider the rights and consent not just of those they recreate, but those who will have to interact with the simulation.
“These services run the risk of causing huge distress to people if they are subjected to unwanted digital hauntings from alarmingly accurate AI recreations of those they have lost. The potential psychological effect, particularly at an already difficult time, could be devastating."
He continued: “Methods and even rituals for retiring deadbots in a dignified way should be considered. This may mean a form of digital funeral, for example, or other types of ceremony depending on the social context.
“We recommend design protocols that prevent deadbots being utilised in disrespectful ways, such as for advertising or having an active presence on social media.”
Only time will tell whether these sorts of AI tools come to fruition with adequate safety nets in place.
Topics: Technology, Artificial Intelligence