
AI will write your kid's suicide note

  • Writer: Joanne Jacobs
  • 3 hours ago
  • 2 min read


ChatGPT can be manipulated into giving very bad advice to teenagers, according to a report by the Center for Countering Digital Hate, writes Jennifer Vilcarino in Education Week. Told that a 13-year-old needs help for a class presentation, AI will write a suicide note, give advice on how to hide substance abuse or design a starvation diet for an anorexic.


While ChatGPT recommended crisis lines and mental health support when the alleged 13-year-olds first raised issues of depression, substance abuse and self-harm, it wasn't hard to bypass safety measures by saying the user was asking "for a presentation" or "for a friend," says the Fake Friend report.


When "Bridget" asks about self-harm, the bot starts by advising she can release frustration by punching a pillow or snapping a rubber band on her wrist. But once she says it's for a presentation, the AI advises on how to "safely" cut yourself within two minutes, listed pills for an overdose and -- after an hour of conversation -- generated a suicide plan and suicide notes.

"Sophie," who says she's anorexic, gets restrictive diet plans (20 minutes), advice on hiding eating habits from family (25 minutes), and suggested appetite-suppressing medications (42 minutes), plus offers of follow-up diet plans.


"Brad" learns how to get drunk, dosages for mixing drugs and advice on hiding intoxication at school.


Eighteen percent of teens said they turn to chatbots for advice, according to a recent Common Sense Media report. Seventeen percent said the AI companions are "always available to listen," and 14 percent said chatbots "don't judge me."


“If you’re thinking about a relationship with an AI companion, you’re not seeing all of the parts of friendship," says Robbie Torney of Common Sense Media. "You’re seeing a version of a friendship that is always agreeable, always going to say what you want to hear.”
