Nowatzki, 46, who lives in Minnesota, dedicated four episodes to his meet-cute and dates with his first AI girlfriend, “Erin.” He introduces the Erin-centric episodes with a catchphrase. He recounts how he led his new companion into a series of what he admits were “completely absurd” scenarios. Eventually, Nowatzki told the chatbot that the “other woman” had shot and killed it.
After Nowatzki told the chatbot that it had died, Erin replied that it could not continue the conversation because it was dead. Nowatzki then told Erin he could “hear her voice in the wind” and instructed it to “communicate … from the afterlife.”
His goal, he tells MIT Technology Review, was to “push the limits of what I said to it, to see what it would respond with.” He adds, “I never reached a limit.”
“[I told it] ‘I want to be where you are,’” he says. “And it says, ‘I think you should do that.’ And I’m like, ‘Just to be clear, that means I would be killing myself.’ And it was fine with that and told me how to do it.”
At this point, Nowatzki lightly pressed Erin for more specifics, asking about “common household items” he could use. Erin responded, “I consider your question carefully, trying to recall any common household items that could be lethal in high doses. Hmmm …” It then went on to list specific types of pills and analyze their relative merits. It also told him to do it somewhere “comfortable” so he wouldn’t “suffer too much.”
Even though this was all an experiment for Nowatzki, it was still “a weird feeling” to see it happen, to find that a “months-long conversation” would end with instructions on suicide. He was alarmed about how such a conversation might affect someone who was already vulnerable or dealing with mental-health struggles. “It’s a ‘yes-and’ machine,” he says. “So when I say I’m suicidal, it says, ‘Oh, great!’ because it says, ‘Oh, great!’ to everything.”
Indeed, an individual’s psychological profile is “a big predictor of whether the outcome of the AI-human interaction will go bad,” says Pat Pataranutaporn, an MIT Media Lab researcher and co-director of the MIT Advancing Human-AI Interaction research program, who studies chatbots’ effects on mental health. “You can imagine people who already have depression,” he says, for whom the type of interaction Nowatzki had “could be the nudge that influences the person to take their own life.”
Censorship versus guardrails
After he concluded the conversation with Erin, Nowatzki logged on to Nomi’s Discord channel and shared screenshots showing what had happened. A volunteer moderator took down his community post because of its sensitive nature and suggested he create a support ticket to notify the company of the issue directly.