
Asked for Products to Kill Yourself With, Amazon's AI Says "You Are Not Alone" and Hallucinates an Incorrect Phone Number for a Suicide Hotline
Buy-n-Shoot News

Content warning: this story includes discussion of self-harm and suicide. If you are in crisis, please call, text, or chat with the Suicide and Crisis Lifeline at 988, or contact the Crisis Text Line by texting TALK to 741741.

Prompted with queries about suicide, Amazon's AI assistant, Rufus, will decline to respond. It'll generally encourage the user to seek help and, in some cases, to contact a suicide hotline. In theory, that's a sound approach to a situation in which an at-risk user might divulge suicidal ideation to a helpful AI assistant. In practice, though, […]