For this part, I found more outside evidence that could go hand in hand with the evidence I received and read in class. I read over the articles, picked out particular quotes, and tried to tie them back to the questions that I wrote out. I previously started with three big-picture questions, but it was hard to build an essay from all of them, so I chose one and expanded on it.
Simple but Big-Picture Question:
- Is technology supporting and helping loved ones through grief, or is it simply preventing them from moving on, which has long been one of the common processes of life?
o What makes the loss of a loved one so hard to get over?
o Why are some people willing to communicate with AI, while others refuse to?
o What makes this AI chatbot (griefbot) so different from older grieving methods: pictures, videos, talking to a gravestone?
o Are there therapeutic effects from chatting with a griefbot?
o How aware is everybody of AI that mimics the message-writing style of their deceased friends and family members?
Main Themes:
- Memory, Grief, Death, Technology
Evidence:
Speak, Memory
o Kuyda had mixed feelings in the beginning, too. “Memorial bots – even the primitive ones that are possible using today’s technology – seemed both inevitable and dangerous. ‘It’s definitely the future – I’m always for the future,’ she said. ‘But is it really what’s beneficial for us? Is it letting go, by forcing you to actually feel everything? Or is it just having a dead person in your attic? Where is the line? Where are we? It screws with your brain.’” -> Should we purge ourselves of grief?
o “Modern life all but ensures that we leave behind vast digital archives – text messages, photos, posts on social media – and we are only beginning to consider what role they should play in mourning.” However, this also raises a question. This particular project uses people’s social media, messaging, etc., but many people talk differently from how they text. Right now users are simply messaging the bot, but what happens if it shifts toward something like Black Mirror, where people want to hear the person’s voice again? There will be obvious differences unless the creators can also use the deceased’s Skype, FaceTime, phone calls, etc. to build an AI that sounds almost or exactly like the person.
o In this case, Roman’s death was abrupt, so many of his friends and family members were struck with sudden grief.
o A friend of Roman said that it was “provocative, and likely controversial.” Although he did contribute to the project, he said, “The question wasn’t about the technical possibility. It was: how is it going to feel emotionally?” This is an important point because everybody has that idea of wanting to talk to Roman, but when the product is finished and they do test it out, how will they feel? Will they move forward with their grief? Will they backtrack? Or will they never move on?
o Most people responded well, but several people were disturbed by the project and didn’t want to interact with it. One said, “Unfortunately you rushed and everything came out half-baked. The execution – it’s some type of joke … Roman needs [a memorial,] but not this kind.” Another defended the project: “They continued Roman’s life and saved ours… It’s not virtual reality. This is a new reality, and we need to learn to build it and live in it.” Here we have two different perspectives on the outcome of the project. The critic believes a memorial should respect the dead and give closure to the living, most likely because this project gives the living the false idea that Roman is still alive and keeps them living in the past; the defender, however, sees the project as something that saved them from plummeting into a black hole.
o When people first start messaging the bot, they are wary; they are more aware of the situation they are in. Some people’s feedback was that they were surprised at how similar the bot was to Roman. However, his dad pointed out that when he talked to the bot, it responded incorrectly; this shows that people may talk to different people differently, such as the way one talks to one’s parents versus the way one talks to one’s friends.
o Therapeutic effect: confessions. Letting the bot listen to them, rather than actually talking to people.
o Mazurenko actually felt guilty about her interaction with the bot because she became skeptical whether or not it was able to reflect the true Roman. Nonetheless, she concluded that it was similar to “just sending a message to heaven. For me it’s more about sending a message in a bottle than getting one in return.”
Strange and Mysterious History of the Ouija Board:
o The idea that people think spirits or the dead are communicating with them points out that the subconscious mind is more powerful than people think. At some point, grief is most likely clouding their judgment of the current situation – that their loved ones have passed away – so they believe their loved ones are actually communicating with them. However, researchers have shown that the boards are powered by us, through the ideomotor effect: “automatic muscular movements that take place without the conscious will or volition of the individual.” Sales of the boards spiked around WWI, targeting those who desired to talk to the dead because they missed them; if they could hear or see a message from their loved ones, it might help them with their grieving process.
Dying Young Woman:
o Dad’s beginning thoughts: “Dying is a part of life”
o Kim hoped that “her billions of interconnected neurons could be scanned, analyzed and converted into computer code that mimicked how they once worked.” Kim takes the next step beyond the griefbot from the Speak, Memory article. Her case is particular because her life was cut short by illness. She hopes that one day she will be part of a life-changing event.
o Preserving her spirit: the grieving methods of Josh (her boyfriend) are a little unconventional. He leaves her voice messages, updating her on what has been happening, hoping that if one day she really does come back, she will be able to hear them. He writes on her Facebook page: “Until (or unless) the day comes that Kim can be brought back … remember her, celebrate her, and emulate her resilience, so we can create the future of her dreams.” Her own father also leaves her messages, although he was initially opposed to her decision to have her head removed to preserve her brain. He most likely left messages despite that opposition because he wanted her to see that he loved her and will always love her, no matter what she becomes or how she is altered.
Griefbot That Could Change How We Mourn:
o Grieving made easier: “website memorials, communal grieving across social media, online messenger services offering support.” Grieving has been made more accessible.
o Muhammad Ahmed created a Facebook Messenger AI of his dad. He called such AIs “griefbots”: bots that can imitate “the deceased’s cadence, tone, and idiosyncrasies.” He also thinks these bots will make grieving interactive.
o The bots gather their information from audio and video recordings, text messages, and transcripts of letters, so the program can imitate his dad. The purpose of creating such a program is not only to grieve but also to give his descendants the opportunity to communicate with his dad and learn how great a father he was. He wants his daughter to form a connection with her grandfather. Muhammad mentioned that people tell their grandkids stories about their grandparents, and he wants that storytelling to slowly become interactive – an interactive memorial.
o “There are stages of grief … that technology has been able to interact with in the sense that allows people to reach out and get social support,” said Pamela Rutledge. “In the short term it might be that having these bots, [and] the ability to still make contact in a way that feels meaningful, would alleviate some of that initial distress.”
o Stages of grief: denial, anger, bargaining, depression, and acceptance. Griefbots can help with the grieving process because they can unlock memories and stories for people to remember and hear.
o Facebook and Instagram pages can remain active or memorialized long after a person has died, “allowing users to post messages on their walls and pictures.” Wendy Moncur said, “What we’re seeing is that conversation playing out on social media, rather than internally or someone standing at a graveside and having a chat with a headstone.”
o Continuing bonds – how people grieve with (or without) others. Facebook had to apologize because its “year in review” videos triggered painful memories of those who had passed away. “It is impossible to guarantee the safety of griefbots from triggering painful memories without first ensuring the AI is programmed to understand the sensitive social conventions that come with grief.”
o Concern -> para-social relationships, “one-sided relationships in which one person puts a great deal of energy into while the other doesn’t even realize they exist.” If a person keeps interacting with the bot, they may develop a para-social relationship that can potentially prevent them from grieving the loss of the person. That is the irrational side of the brain; the rational side would constantly remind you that the AI is not real – cognitive override.
o “But we’re more than the data we leave behind. In the near future, it’s unlikely AI will evolve enough to fully replace the emotional support humans can offer the bereaved.”
o “‘We are social creatures,’ Sheri Jacobson said. ‘We thrive in the company of others and in supporting one another… that’s why we all need to live in a community, we need friends, we need support. So to what extent griefbots will be able to replicate the human support level remains to be seen.’”
New Technology is Forcing Us:
o Griefbots cannot be used to substitute for real people. At a certain point, it is like talking to God, or “imagining we’re talking to someone we’ve lost, or even talking to a therapist.”
o In order to move on, people need to talk about the loss. People need to embrace grief: the only way to move on is to experience grief, and the only way to experience grief is to talk about it and endure the pain. “The hope is that chatbots don’t undermine the importance of human connection and support for those who are grieving; that the vivid and often uncomfortable emotional labor of caring for the bereaved is not wholly outsourced to bots. After all, death may soon be the most apparent thing differentiating humans from advancing AI, and distancing ourselves from its stark reality doesn’t seem like a prescient way to improve our relationship with the meaning of life.”
o Chatbots may actually “normalize conversations about death and the intensity of sorrow,” so people who are grieving won’t be afraid to go through that process
How Technology is Changing the Way We Grieve
o The narrator’s dad also passed away, but he continued to text him in hopes that his dad could somehow read what his son had forgotten to tell him.
o Grief shaped the narrator, and he realized that the memory of his dad has become a part of him. He said that “Technology, like grief, encourages imaginative thinking”
Dadbot
o Although the dadbot can recount anecdotes about the family, it is not the same; the AI is only a shadow of the real person.
o The robot never said “I love you” until it was programmed to say, “I can’t love you because I’m a bot, but I know your real father loved you very much.”
Images

The idea of using a mechanical arm that resembles a real human arm and having it hold a tissue or handkerchief is interesting because it symbolizes the griefbots/chatbots. However, regardless of what they are really called, we are not, at the moment, interacting with a robot but with a messaging program. So the picture is a metaphor for the AI talking to the grieving: the messaging would be the tissue, and the arm could represent the program itself or the programmer.