Opinion | Can a chatbot help you overcome your grief?

An older Korean man named Mr. Lee, dressed in a blazer and trousers, grips the arms of his chair and leans toward his wife. "Darling, it's me," he says. "It's been a long time."
"I never would have expected this to happen to me," his wife replies through tears. "I'm so happy right now."
Mr. Lee is dead. His widow is speaking with an AI-powered likeness of him projected onto a wall.
“Please, never forget that I’m always with you,” says the projection. “Stay healthy until we meet again.”
This conversation was filmed as part of a promotional campaign for Re;Memory, an artificial intelligence tool created by the Korean start-up DeepBrain AI, which offers professional-grade studio and green-screen recording (as well as relatively cheap do-it-yourself options) to create lifelike representations of the dead.
It is part of a growing market of AI products that promise users an experience verging on the impossible: communicating, and even "reuniting," with the deceased. Some of these representations, such as those offered by HereAfter AI and StoryFile (which also frames its services as having historical value), can be programmed with a person's memories and voice to produce realistic holograms or chatbots with which family members and others can converse.
The desire to bridge life and death is innately human. For millennia, religion and mysticism have offered paths to do so, blurring the lines of logic in favor of belief in eternal life.
But technology, too, has a history, a relatively recent one, of attempting to connect the living and the dead.
Just over a century ago, Thomas Edison announced that he had been trying to invent an "apparatus" that would allow "personalities which have left this earth to communicate with us." Known for his contributions to the telegraph, the incandescent bulb and motion pictures, Edison told The American Magazine that this device would operate not by any "occult" or "weird" means but by "scientific methods."
As science and technology have evolved, so have the ways we try to transcend death. Where the 19th and early 20th centuries saw the rise of spiritualism and pseudoscientific attempts to commune with the dead, from séances to ghost sightings to "spirit photography," with the invention of today's AI avatars we are entering a new era of techno-spiritualism.
Machines already mediate much of our lives and dictate many of our decisions. Algorithms feed us news and music. Targeted ads divine our desires. Sleep-tracking apps and smartwatches gamify our fitness. But until recently, grief and death remained among the few aspects of modern life not fully subsumed by the constant societal drumbeat of optimization, efficiency and productivity.
As the so-called death-tech industry takes off and AI becomes more ubiquitous, however, grief may not remain beyond the fray for long.
Artificial intelligence used for psychological well-being is already relatively mainstream. It tends to come in the form of mental health or "companion" chatbots, such as Replika, which some people use to create avatars they rely on for emotional support. The latest wave of this technology, however, has grief and loss specifically in its cross hairs.
Many companies producing AI avatars and chatbots have adopted the language of optimization, suggesting that their tools can help people "ease grief" or otherwise better process loss by offering the possibility of postmortem conversations and closure. These claims play into the flawed but persistent notion that grief moves linearly, in discrete stages through which one can progress predictably and cleanly.
Prominently displayed on the Re;Memory website is a quote attributed to Confucius: "If one does not weep over a meaningful loss, what else could one's grief be for?" The implication seems to be that only by conjuring a deceased loved one through its technology can a person grieve properly.
The potential risks of AI tools for mourning are significant, not least because the companies that produce them are profit-driven, incentivized to exploit desires and vulnerabilities in ways that could be unhealthy for their users. A recent study from the University of Cambridge, for example, assessed the ethics of the "digital afterlife industry" and hypothesized that these companies could soon realize there is even more money to be made by requiring people to pay subscription fees, or to watch ads, in order to keep interacting with the bots of their loved ones. Deadbots could even offer sponsored suggestions, such as ordering the deceased's favorite dish through a specific delivery service.
Another dystopian scenario the Cambridge researchers imagined is a company failing (or refusing) to deactivate its deadbots, which could leave survivors receiving "unsolicited notifications, reminders and updates" and feeling that they are "being stalked by the dead."
This blending of reality, fantasy and commerce is harmful to the grieving.
If the Victorian séance offered the temporary illusion of otherworldly communion, today's digital afterlife offers something even more insidious: an ongoing, interactive relationship with the dead that can prevent or delay a true coming to terms with loss.
In some contexts, chatbots and avatars could be useful tools for processing a death, particularly if they are treated as spaces for reflection, like journals. But in our efficiency-obsessed culture, which encourages us to skip life's unpleasant, painful and messy parts simply because we think we can, a healthy use of these tools is possible only when accompanied by a firm understanding that the bots or holograms are not, fundamentally, real. The uncanny likeness of many of these avatars complicates that, making it more likely that their net effect will be not to help people process grief but to allow them to avoid it.
The more we use these tools for avoidance, the greater their potential for harm, disconnecting us from our grief and from the communal mourning toward which our society should strive. And if we ever come to see the use of these tools as a necessary part of mourning, we should say so plainly.
Whether these AI grief tools will become popular is not immediately clear, but given the sheer number of companies competing to create and market them, an industry led mostly by the United States, China and South Korea, it is fair to assume they will become a significant part of our shared future.
What would it mean, instead, to pause and embrace the decidedly unoptimized feelings that surround loss? What would it mean to consider that, however useful efficiency and optimization may be in the marketplace, they have no place in matters of the heart?
As we enter a new era of techno-spiritualism, the question will not be whether the culture of optimization comes for grief, but how we choose to respond when it inevitably does.
From the spirit phone to deadbots, there have been and will always be attempts to connect with the deceased through technology. More worrying is that the AI offerings we have today represent only the tip of an enormous iceberg. The near future will supply ever more realistic and seductive ways to ignore or wholly re-create our realities, and to isolate ourselves ever further in our grief.
As individuals, we may not be able to control the march of technological progress. What we can control is how we confront what is unpleasant and painful: by embracing those feelings, even, and especially, when they are at their most intense.