When a kid's BFF is a bot

  • Writer: Joanne Jacobs
  • Jul 20
  • 2 min read

More than a third of children surveyed said they considered their chatbot a friend, and 15 percent said they'd rather talk to a bot than to a real person. "One in four vulnerable children say they use chatbots because they have no one else to talk to," according to the report.


A proposed California bill would require AI developers to protect minors "from the addictive and manipulative aspects of chatbot technology," writes Price. "The bill proposes protections like age warnings, reminders that users are talking to AI — not a real person — and mandatory reporting on the connection between chatbot use and youth mental health."


Real human-to-human friendships teach kids to resolve conflicts, writes Russell Shaw, head of Georgetown Day School, in The Atlantic. "Virtual companions, such as the chatbots developed by Character.AI and PolyBuzz, are meant to seem like intimates, and they offer something seductive: relationships without the messiness, unpredictability, and occasional hurt feelings that characterize human interaction."


Bots are programmed to be supportive, he writes. They never argue or criticize or ask you to listen to their problems. They laugh at all your jokes. "Many teenagers are already prone to seeking immediate gratification and avoiding social discomfort," writes Shaw. "AI companions that provide instant validation without requiring any social investment may reinforce these tendencies precisely when young people need to be learning to do hard things."


Adults will be seduced by AI's ability to tell us what we want to hear, tweets Sean Illing. "Already, with every click, with every like, with every share, with every purchase, we are telling the machines — and the people who own them — how we want to feel and they are getting better at making us feel it."
