Introduction to Griefbots
Often, people who have suffered a loss long to talk again, even for a moment, with their loved one. New technologies are now making that possible, at least "virtually", thanks to artificial intelligence.
Is this merely a new form of "digital memory", or a technology surrounded by serious ethical questions?
The "grief bots" and the digital footprint
The so-called grief bots (literally, "mourning robots") are chatbots – computer programs based on artificial intelligence (AI) and capable of conversing with humans – built from the "digital footprint" that the loved one has left behind: a legacy of social media posts, videos, photos, emails, and text messages that feed an artificial neural network.
Together, these data allow the chatbot to "imitate" the style and way of thinking of the person who has passed away. In this way, loved ones can continue to chat with them virtually after their death.
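In its very simplest form, the construction described above, a conversational system built from a person's message history, can be sketched as a retrieval bot that answers a new message with the stored reply whose original prompt is most similar. This is only a toy illustration under invented data, not how any real griefbot works (real systems fine-tune neural language models on the footprint rather than retrieving verbatim replies):

```python
from collections import Counter
import math

def tokenize(text):
    return [w.strip(".,!?;").lower() for w in text.split()]

def cosine(a, b):
    # cosine similarity between two bag-of-words Counters
    common = set(a) & set(b)
    dot = sum(a[w] * b[w] for w in common)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class RetrievalGriefbot:
    """Replies with the stored reply whose original prompt is
    most similar to the incoming message (a deliberate toy)."""
    def __init__(self, history):
        # history: list of (message_received, reply_sent) pairs
        self.history = [(Counter(tokenize(m)), r) for m, r in history]

    def reply(self, message):
        query = Counter(tokenize(message))
        best = max(self.history, key=lambda pair: cosine(query, pair[0]))
        return best[1]

# Invented sample footprint
history = [
    ("how was your day", "Long, but I got out for a walk at sunset."),
    ("do you miss home", "Every autumn. The leaves here are not the same."),
]
bot = RetrievalGriefbot(history)
print(bot.reply("how was your day today"))
# prints "Long, but I got out for a walk at sunset."
```

Even this crude sketch shows the essential mechanism the article describes: the bot can only recombine what the person already said, which is both its appeal and, as discussed below, its limitation.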
Do griefbots help you get over the loss of a loved one?
The main argument for creating this new technology is that it offers an important source of support to people in grief.
This was one of the reasons why, independently of each other, researchers Eugenia Kuyda and Muhammad Ahmad began developing griefbots. The idea came to them when they lost, respectively, a best friend and a father.
The authors present griefbots as a modern take on grief rituals and classic forms of remembrance, such as a funeral or a photo album.
From this point of view, griefbots would play a positive psychological role in the grieving process. After all, they allow people to interact in a more sophisticated way with the memory of the dead person, commemorating their life and helping to keep their memory alive.
Ahmad maintains that the griefbot of his father was directly inspired by his wish for his children to be able to meet their grandfather.
Thus, it offers a direct, interactive experience that goes beyond the classic stories passed down from generation to generation about who their grandfather was.
For her part, Kuyda emphasizes that chatting with her best friend after his death helped her talk about the loss and allowed her to discover aspects of herself she did not know.
Both programmers maintain that conversations with their deceased loved ones helped them put words to the feelings that overwhelmed them at the time of loss, and to freely express their fears and concerns.
In general, these were things they would not have dared to do with someone "real" for fear of being judged.
What ethical risks does this robot-human interaction entail?
However, other authors warn of the ethical risks that this kind of robot-human interaction could entail.
In addition to the privacy of the dead person and the ownership and use of their "digital footprint" (for purposes not necessarily consented to or desired), there is the question of how this technology affects bereaved family and friends.
One possible consequence, contrary to what its creators maintain, is that it could make it harder for the bereaved to move on with their lives and gradually adapt to a world without the loved one's presence.
It could happen that, by focusing their attention on virtual interaction with the deceased, the bereaved person begins to isolate themselves socially.
In addition, it must be borne in mind that this "virtual other" is built from the history of conversations held by the deceased. That is, the chatbot uses the person's past exchanges to predict future responses.
This means it can give answers that do not fit what would have been expected of the loved one: either because new circumstances in the present would have called for more adapted reactions, or because the answers clash with what we knew, or thought we knew, about the person.
Keep in mind that such tools are based on the person's entire digital footprint. This includes, for example, all of their conversations with third parties.
It therefore also captures their different possible "selves" or public personalities, which varied with the context and the people with whom each interaction occurred.
In other words, using these tools could make us "discover" facets of the loved one that we did not know, and perhaps would have preferred not to know.
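The risk just described, that a footprint mixing many contexts can surface the "wrong" persona, can be made concrete with a toy word-overlap retriever. All names and messages below are invented, and real systems are far more sophisticated, but the underlying failure mode is of the same kind: similarity over the whole footprint is blind to who each message was originally addressed to.

```python
from collections import Counter

def tokenize(text):
    return [w.strip(".,!?;").lower() for w in text.split()]

def overlap(a, b):
    # number of shared word occurrences between two token lists
    return sum((Counter(a) & Counter(b)).values())

# Messages the deceased sent in two different contexts ("personas")
corpus = [
    ("colleague", "Let's push the deadline; I can review the draft tonight."),
    ("family",    "Sleep well, love you, call me tomorrow."),
]

def most_similar_persona(query):
    q = tokenize(query)
    return max(corpus, key=lambda item: overlap(q, tokenize(item[1])))[0]

# A family member's question can surface a message written for a
# colleague, because the similarity measure is purely lexical.
print(most_similar_persona("can you review something tonight"))
# prints "colleague"
```

A bereaved relative chatting with such a system may thus encounter a register, or a "self", of the loved one that was never addressed to them.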
All this would affect our image of the person, making us forget that the chatbot is nothing more than that: a robot.
End-of-life chatbots and important decision making
At present, chatbots are not only being used to accompany grief: another type is intended to counsel people at the end of their lives. Their objective, in this case, is to accompany them in preparing their last wishes.
They are also intended to help palliative care patients reduce anxiety about death, for example by promoting behaviors that mitigate stress or by enabling discussion of essential spiritual issues.
The transformation of the experience of mourning and end of life by new technologies
Ultimately, new technologies are transforming our experience of grief, and even our way of understanding what death is. All of this raises urgent bioethical questions about the psychological and social implications of their implementation.
Critical reflection is needed on the possible impact all this could have on the bereaved.
It is also needed on the psychological function of rituals and of the so-called "continuing bonds" with the loved one, a key issue in clinical psychology.
Nor should we forget to open a debate on the values and interests that sustain this type of technology, as well as the cultural conceptions of life and death that may be implicit in its use, a subject in which ethics is essential.