Deadbots And Regulation: An Ethical And Legal Matter That Demands Discussion

The new European legal framework on artificial intelligence, the EU AI Act, came into force on August 1, aimed at preventing rights violations through the use of this technology. The legislation classifies AI according to the level of risk it may pose to individuals and society, and prohibits technologies that pose an “unacceptable risk,” such as those that manipulate and exploit people’s vulnerabilities.

One technology that could fall into this category is deadbots, which some companies are already developing and planning to market in the near future. These are chatbots based on the digital identity of a deceased person (WhatsApp messages, social media, emails, etc.) that can hold conversations with the deceased person’s family and friends, emulating their personality. Although it may seem like science fiction, it is not, and services of this type are closer than we might imagine.

Belén Jiménez, who holds a PhD in Psychology and is a member of the Faculty of Psychology and Educational Sciences and a researcher in the IN3 CareNet group at the Universitat Oberta de Catalunya (UOC), is a specialist in the technological mediation of grief. Part of her research focuses on deadbots, an area in which she has published several studies.

A complex debate without clear answers

“Although deadbots have not yet been marketed, we need to reflect on the bioethical aspects of this technology. Their use may soon become normal, as has happened with other applications that may initially have surprised us but which are now widely used, such as dating apps. More and more companies are emerging in what is known as the digital afterlife industry, and they are improving the technology,” Jiménez explained. She believes it is essential to “study how deadbots mediate grief and can transform it. It is a field in which there are hardly any scientific studies and there are no clear answers, since their use and effects depend on various factors, including how these technologies are designed.”

Among other things, the new European legislation stipulates that chatbots must inform the user that they are communicating with a computer program and not with a person. Although it classifies this technology as “limited risk”, in sensitive contexts such as health, which would be the case with deadbots, the implications of these programs must be carefully analysed.

Research carried out by Belén Jiménez, who is also a member of the CERPOP research group at the University of Toulouse, has shown that the bereaved display ambivalent attitudes to this new technology: the desire to maintain emotional ties with their loved ones is combined with an uneasiness that comes from interacting with a program based on the deceased person’s digital identity.

Deadbots are based on so-called “continuing bonds” between the bereaved and the deceased, a term frequently used in the psychology of grief. The UOC researcher said that “these technologies take advantage of people’s need to establish emotional bonds”. Indeed, they could be equivalent to an advanced and technological version of having an imaginary conversation with our loved one in front of their grave or preserving their memory through photographs and videos. “This need to maintain bonds doesn’t necessarily have to be pathological,” explained Jiménez, “and it is normal for many people. However, certain precautions must be taken when using deadbots and it is essential to regulate their use, since the profit motive of the companies that market them may not be aligned with the potential therapeutic use of this technology.”

In the absence of studies, Jiménez pointed out that the psychological effects of these technologies will depend on the users themselves, on the relationship they had with the deceased and the relationship they establish with the chatbot. “One of the dangers is that it could lead to negative effects, such as the creation of a relationship of dependency, and even suffering caused by a second loss, if the deadbot disappears – for example, due to technical problems.”

Regulating the digital afterlife industry

Our desire for immortality, combined with technological progress, is stimulating the digital afterlife industry, a sector that exploits the digital presence of deceased people to perpetuate their memory and even extend their digital activity. This has many ethical and social implications. Companies pursue commercial and economic ends that may conflict with the potential therapeutic objectives of these tools. Strategies such as having deadbots send notifications and take other actions to keep the bereaved “hooked” may be ethically questionable, according to Jiménez.

“We are dealing with a new technological development based on artificial intelligence, involving great risks, and it must be regulated to anticipate its possible negative effects, while we must also take its ethical dimension into account,” said the researcher. “The new European regulations focus on promoting the transparency of these technologies, which is essential in such sensitive areas as grief. In addition, companies that develop these services must comply with rigorous standards and invest in auditing, transparency and documentation programmes,” she explained. The AI Act provides for fines of up to €35 million or 7% of a company’s total worldwide annual turnover for the most serious breaches of the law.

In the absence of specific regulations for deadbots, Jiménez proposed that any such rules “should particularly ensure respect and dignity for the deceased person, as well as promoting the psychological well-being of the user, especially if they are grieving.”

