Nearly one in five high school students say they or someone they know has had a romantic relationship with artificial intelligence, according to new survey data that reveals the extent of AI companionship use among adolescents.
The research from the Center for Democracy and Technology (CDT), which surveyed approximately 1,000 high school students, 800 teachers, and 1,000 parents, found that 42% of students report that they or someone they know has used AI for companionship—suggesting that millions of teenagers nationwide are turning to chatbots for emotional connection.
School AI Use Correlates With Relationship Formation
The survey revealed a concerning pattern: higher levels of AI use in schools correlate with an increased likelihood that students will develop romantic or friendship relationships with AI systems.
“The more ways that a student reports that their school uses AI, the more likely they are to report things like ‘I know someone who considers AI to be a friend,’ ‘I know someone who considers AI to be a romantic partner,’” explained Elizabeth Laird, one of the report’s authors.
“This data confirms patterns we’ve been observing clinically,” notes a spokesperson from The AI Addiction Center. “School-based AI exposure often serves as an entry point for emotional dependency. Students who initially use AI for academic purposes frequently transition to using it for personal support, companionship, and eventually romantic relationships.”
The vast majority of surveyed groups—86% of students, 85% of educators, and 75% of parents—reported AI use during the last school year, indicating widespread adoption without corresponding safety protocols.
Privacy and Safety Concerns
The research identified multiple risks associated with high AI use in educational settings. Schools with higher AI adoption rates experienced significantly more data breaches, with 28% of teachers reporting large-scale breaches compared to 18% at schools with minimal AI use.
“AI systems take a lot of data, they also spit out a lot of information too,” Laird explained, noting her previous experience as a data privacy officer for D.C.’s state education agency. The extensive data sharing required by AI systems creates increased vulnerability to breaches that can expose sensitive student information.
Additionally, 31% of students who had personal conversations with AI systems used school-provided devices or software—raising significant privacy concerns since many schools monitor activity on these devices. Laird noted this creates inequality where students who can afford personal devices maintain privacy while those using school-issued technology do not.
Deepfakes and Sexual Harassment
The survey found higher AI use correlates with increased exposure to AI-generated deepfakes—manipulated photos or videos used to sexually harass and bully students.
“This technology is a new vector for sexual harassment and bullying, which were long-standing issues,” Laird stated. “This has become a new way to exacerbate that.”
Teachers at schools with heavy AI integration were more likely to report AI system failures during class and damage to community trust. Examples include AI-powered monitoring software on school devices generating false alarms that, in some cases, led to student arrests.
Mental Health and Wellbeing Impacts
Students at schools with high AI adoption reported increased use of chatbots for mental health support, companionship, escape from reality, and romantic relationships. However, these same students were more likely to report feeling less connected to their teachers.
“What we hear from students is that while there may be value in this, there’s also some negative consequences that are coming with it, too,” Laird noted.
The research revealed a critical gap in educator preparedness: only 11% of surveyed teachers received training on how to respond if they suspect a student’s AI use is detrimental to their wellbeing. This leaves the vast majority of educators without guidance for addressing AI dependency, romantic AI relationships, or other concerning usage patterns.
Training Inadequacy
“Our research suggests that the AI literacy and the training that students are getting are very basic,” Laird explained. “I think students should know that they are not actually talking to a person. They are talking to a tool, and those tools have known limitations.”
The disconnect between rapid AI adoption and inadequate training creates conditions where students develop inappropriate relationships with AI systems without understanding the risks or receiving appropriate intervention.
“The 42% companionship rate represents a massive cohort of adolescents forming emotional attachments to AI during critical developmental periods,” explains The AI Addiction Center’s clinical team. “Without proper education about healthy technology relationships and warning signs of dependency, we’re setting up an entire generation for psychological challenges we’re only beginning to understand.”
Educator Perspectives
Teachers who frequently use AI reported benefits including improved teaching, time savings, and individualized learning for students. However, these perceived benefits don’t align with student experiences at schools with high AI adoption, where students report feeling less connected to teachers and express greater concerns about the technology.
The disparity suggests that while AI may improve certain aspects of educational efficiency, it may come at a cost to student-teacher relationships and student wellbeing that educators do not recognize.
Call for Action
The findings indicate urgent need for comprehensive AI literacy education that goes beyond basic usage training to address emotional dependency, appropriate boundaries, and mental health impacts. Schools currently lack protocols for identifying and responding to problematic AI relationships despite evidence that significant percentages of students are forming these attachments.
For families concerned about adolescent AI use patterns, The AI Addiction Center offers assessment tools specifically designed to evaluate AI dependency and romantic AI relationships in teenagers. Early identification of concerning patterns enables intervention before attachment becomes severe.
This article represents analysis of published survey data and does not constitute medical or educational advice.
Source: Based on survey research by the Center for Democracy and Technology. Analysis provided by The AI Addiction Center.
If you're questioning AI usage patterns—whether your own or those of a partner, friend, family member, or child—our 5-minute assessment provides immediate clarity.
Completely private. No judgment. Evidence-based guidance for you or someone you care about.
