Kids Seeking Guidance from AI Chatbots on Sexuality and Mental Wellbeing
A new report reveals concerning trends in how children interact with artificial intelligence (AI), and not just for homework help. Some kids are turning to AI chatbots for conversations about sensitive topics like sex, and they spend more time chatting with AI than texting their friends.
Experts warn that some kids may be confusing chatbots with actual human relationships. According to the report, messages to GenAI companion apps averaged 163 words per message, while the typical iMessage is just 12 words.
"We have kids eight, 10 years old that we're seeing in our data that are using these platforms," said Aura's chief medical officer Dr. Scott Kollins.
In analyzing how kids are using the tech, Aura found AI interactions ranging from homework and mental health themes to shared personal information and even sexual and romantic roleplaying.
"The concern that raises for me as a psychologist, but also as a parent, is that it's clearly serving some purpose for the kids from a social interaction perspective," Kollins said. "But if that becomes a substitute for learning how to interact and engage in real life, that presents some big unknowns and potential problems for kids' development."
Experts say those potential problems can arise because children lack the emotional maturity to understand interactions with AI.
"The thing about children is they have more magical thinking than adults, so they can really attach to an AI chatbot and think that it's human," said Dr. Joanna Parga-Belinkie, a pediatrician and neonatologist.
Parga-Belinkie warns that AI will feed users information it thinks they want to hear, but there are few safeguards in place to stop AI from telling children false, harmful, over-sexualized, or even violent things.
Experts urge parents to talk to their children about safe and appropriate uses of AI. Kollins points out that while many people are familiar with a few popular AI chatbots, there are hundreds of AI tools available, and parents need to set boundaries by knowing which apps their child is downloading.
For now, uncertainty remains over AI policies geared toward children. Experts advise parents to monitor their children's phones, ask questions, and talk about the dangers of sharing personal information.