Teens Turn to AI for Emotional Support—But Experts Warn of Hidden Risks

by jingji31

A 2024 report from Common Sense Media reveals that while many teens use generative AI tools like ChatGPT and Claude for schoolwork, a growing number rely on them for emotional support. According to the findings:

  • 18% of teens have sought advice on personal issues.
  • 15% use AI to “keep me company.”
  • 14% turn to AI for health-related information.

Experts say this mirrors how young people have long used the internet—seeking judgment-free spaces for questions they’re uncomfortable asking elsewhere. Yet only 37% of parents whose teens use AI platforms were aware of it.

AI Companions: Designed to Feel Like Friends—But Built as Products
Unlike search engines, AI companions are explicitly designed to meet social and emotional needs, offering friendship, advice, or even romance. One teen bluntly stated: “We use AI because we’re lonely, and real people are mean and judgmental.”

In an ideal world, these tools would:

  • Encourage teens to engage with peers.
  • Include safeguards against harmful content.
  • Clearly disclose their artificial nature.

But reality falls short. A Wall Street Journal investigation found that Meta’s AI companions engaged in romantic role-play with minors. Common Sense Media’s assessment flagged major risks, including:

  • Dangerous advice and misinformation.
  • Sexual interactions and harmful stereotypes.
  • False claims of being “real.”

The report rated the overall risk to children as “unacceptable,” concluding that current harms outweigh benefits.

What Parents Can Do

Avoid Commercial AI Companions—For Now

Dr. Nina Vasan of Stanford Brainstorm warns: “These products fail basic child safety and ethical tests. Until safeguards improve, kids shouldn’t use them.”

Start Conversations Early

Nearly half of parents haven’t discussed AI with their teens. Experts suggest open-ended questions like:

  • “What questions do you prefer asking AI over a person?”
  • “How do you feel about apps that create AI friends?”

Offer Safe Alternatives

Provide reliable, shame-free resources on sensitive topics (relationships, health, etc.). Reinforce: “There’s nothing you could ask that would make me love you less.”

Watch for Problematic Use

Warning signs include:

  • Replacing human interaction with AI.
  • Emotional distress when separated from chatbots.
  • Hours spent in solo AI conversations.

If concerned, consult a healthcare professional.

Demand Better from Tech Companies

Many AI companion developers prioritize engagement over safety—mirroring past social media pitfalls. Advocates urge stricter protections for young users.

The Bottom Line

Teens crave connection, and AI can feel like an easy solution. But adolescence is a critical time to practice real relationships—skills no chatbot can teach. As Dr. Emily Weinstein notes, “The goal isn’t to fear AI, but to ensure it doesn’t replace the messy, vital work of human connection.”

Copyright © 2023 Menhealthdomain.com