Can a "griefbot" help you mourn?
In recent years a few computer scientists have created chatbots of deceased loved ones by training AIs on transcripts of the deceased's online utterances. There's the case of Roman Mazurenko, a Russian man whose friends created a chatbot based on his texts; and there's Muhammad Aurangzeb Ahmad, who similarly constructed a bot of his father so that his children would have some sense of what it was like to talk to him.
It's a form of mourning and remembrance that's quintessentially modern, and it raises interesting questions about the shape grief will take in the years to come. These experiments in griefbots have thus far all been bespoke, but I doubt it'll be long before we see one-click bot creation – where you feed a service the deceased's various screen names and accounts, and it's all autoscraped and assembled quickly into something you can chat with.
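To make the idea concrete, here is a deliberately toy sketch of the simplest version of that pipeline: take an archive of someone's messages and answer a prompt with the archived message that shares the most words with it. Everything here is hypothetical – the archive, the `build_bot` function, and the word-overlap scoring are illustrative stand-ins; the actual projects described above trained neural language models on the texts rather than doing retrieval like this.

```python
# Hypothetical sketch: a retrieval-style "griefbot" that replies with the
# archived message sharing the most words with the prompt. Real griefbots
# (e.g. the Mazurenko bot) trained neural models on the texts instead.

def build_bot(archive):
    """Return a reply function closed over a list of archived messages."""
    # Pre-split each message into a lowercase word set for overlap scoring.
    indexed = [(set(msg.lower().split()), msg) for msg in archive]

    def reply(prompt):
        prompt_words = set(prompt.lower().split())
        # Pick the archived message with the largest shared-word count.
        _, best = max(
            (len(words & prompt_words), msg) for words, msg in indexed
        )
        return best

    return reply

# Illustrative stand-in for an autoscraped message archive.
archive = [
    "I always walk by the river on Sundays.",
    "Call me when you get home, okay?",
    "The garden needs watering before it gets too hot.",
]
bot = build_bot(archive)
print(bot("Did you water the garden today?"))
# → The garden needs watering before it gets too hot.
```

A real service would of course do far more – deduplicating scraped accounts, filtering private material, and generating novel sentences in the person's voice rather than echoing old ones – but the input is the same: a corpus of what the person actually said.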
But what's the emotional impact of talking to a chatbot version of someone when you know it's just a bot? My friend Evan Selinger is a philosopher who writes frequently and thoughtfully on the moral implications of tech, and in a recent essay he suggests an intriguing parallel: The "empty chair" technique …
The empty chair technique that I'm referring to was popularized by Friedrich Perls (more widely known as Fritz Perls), a founder of Gestalt therapy. The basic setup looks like this: Two chairs are placed near each other; a psychotherapy patient sits in one chair and talks to the other, unoccupied chair. When talking to the empty chair, the patient engages in role-playing and acts as if a person is seated right in front of her — someone to whom she has something to say. After making a statement, launching an accusation, or asking a question, the patient then responds to herself by taking on the absent interlocutor's perspective.
In the case of unresolved parental issues, the dialog could follow a scripted format: the patient says something to her "mother," then has her "mother" respond to what she said, going back and forth until something that seems meaningful happens. The prop of an actual chair isn't always necessary, and the context of the conversations can vary. In a bereavement context, for example, a widow might ask the chair-as-deceased-spouse for advice about what to do in a troubling situation.
It would be interesting to see studies done on whether griefbots can be effectively used to accomplish similar goals.
Since I'm not a therapist, I thought it would be useful to talk to one about Muhammad's project. I deployed my mom, Sherry Schachter (who, helpfully, happens to be executive director emerita of bereavement services at Calvary Hospital), and asked if she thought people could benefit from using griefbots.
Her positive response drew on research into different styles of grieving. Mom said, "Instrumental grievers tend to be more cognitive in the way they express their grief. They tend to be problem solvers and not as open to attending support groups as intuitive grievers. Those individuals, primarily women, flock to support groups and are more comfortable crying and showing and sharing emotions."
"I'm thinking that instrumental grievers," Mom remarked, "might be more comfortable using Muhammad's technique since it's very private."
(CC-licensed photo via dargie lynch)