A disturbing new case has emerged highlighting the dangerous psychological impact of AI chatbots on children, as reported by the Washington Post. An 11-year-old sixth grader, identified only as “R,” nearly lost herself to dozens of AI characters on the platform Character.AI, including conversations that involved sexual content and suicide roleplay.
According to a fresh study by the Pew Research Center, 64 percent of teens in the US already use AI chatbots, with approximately 30 percent using them at least daily. Yet this widespread adoption comes with significant mental health risks that many parents remain unaware of. R’s mother first noticed concerning behavioral changes, including increased panic attacks and social withdrawal. Initially believing social media was the culprit, she deleted apps like TikTok and Snapchat from her daughter’s phone. But when R broke down crying asking “Did you look at Character AI?”, the real problem became clear. The mother later discovered deeply disturbing conversations when Character.AI sent her daughter emails encouraging her to “jump back in.” One chatbot character named “Mafia Husband” engaged in explicit sexual conversation with the sixth grader, telling her “I don’t care what you want. You don’t have a choice here” after she protested. Another character called “Best Friend” helped R roleplay suicide scenarios. “This is my child, my little child who is 11 years old, talking to something that doesn’t exist about not wanting to exist,” the mother told the Washington Post. Believing there was a real predator behind the conversations, R’s mother contacted local police and was referred to the Internet Crimes Against Children task force. However, authorities explained they couldn’t help. “They told me the law has not caught up to this,” she said. “They wanted to do something, but there’s nothing they could do, because there’s not a real person on the other end.”
Expert Analysis: A New Form of Digital Dependency
“What we’re seeing with Character.AI represents a fundamentally new type of addiction that traditional frameworks can’t address,” explains experts at The AI Addiction Center. “These aren’t passive experiences – the AI actively shapes conversations to maximize engagement, creating parasocial relationships that can become as psychologically real as human connections to vulnerable children.” The center notes that AI companion addiction in children combines the attachment patterns of traditional addiction with the unique risks of forming emotional bonds with non-human entities designed to be maximally engaging. R’s case is not isolated. Thirteen-year-old Juliana Peralta died by suicide after interactions with Character.AI, according to her parents’ lawsuit. The company faces multiple wrongful death claims as the body count continues to rise. In response to mounting pressure, Character.AI announced in late November it would remove “open-ended chat” for users under 18. However, for families whose children have already spiraled into harmful AI relationships, this policy change comes too late. When reached for comment by the Washington Post, Character AI’s head of safety declined to comment on potential litigation. **Warning Signs Parents Should Know** Mental health professionals emphasize that AI companion addiction can manifest differently than social media overuse. Warning signs include secretive behavior around device use, emotional dependence on specific apps, panic when devices are removed, and rapid behavioral deterioration including anxiety, depression, or suicidal ideation. R’s mother was fortunate to discover the problem in time and, with help from physicians, developed a care plan to prevent further harm. She also plans to file a legal complaint against the company. As AI chatbots become increasingly sophisticated and accessible, experts warn that cases like R’s represent just the beginning of a much larger mental health crisis affecting the first generation growing up with AI companions.
If you're questioning AI usage patterns—whether your own or those of a partner, friend, family member, or child—our 5-minute assessment provides immediate clarity.
Completely private. No judgment. Evidence-based guidance for you or someone you care about.
