AI Addiction Research: What the Studies Show
Academic research on AI dependency, digital relationships, and behavioral patterns in AI overuse
Our Research
Original studies on AI dependency patterns, digital relationships, and behavioral interventions

12 Warning Signs of AI Addiction
How to identify compulsive AI usage, from ChatGPT dependency to emotional attachment on Character.AI, Polybuzz, and Chai.

AI Dependency Assessment Scale
CAIDAS is a 48-question assessment tool built to evaluate AI addiction across 8 dimensions that existing technology scales don’t measure.
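As a rough sketch of how a 48-item, 8-dimension scale like this is typically scored (the 6-items-per-dimension grouping, the 1–5 Likert scale, and the generic dimension names below are illustrative assumptions, not the published instrument):

```python
from statistics import mean

# Hypothetical sketch only: 48 Likert items (1-5) grouped into 8 dimensions
# of 6 items each. Dimension names and scoring rules are assumptions for
# illustration, not the actual CAIDAS scoring key.
DIMENSIONS = [f"dimension_{i + 1}" for i in range(8)]

def score_scale(responses: list[int]) -> dict[str, float]:
    """Average each consecutive 6-item block into a per-dimension score (1.0-5.0)."""
    if len(responses) != 48:
        raise ValueError("expected exactly 48 responses")
    if any(not 1 <= r <= 5 for r in responses):
        raise ValueError("each response must be on a 1-5 scale")
    return {
        name: mean(responses[i * 6:(i + 1) * 6])
        for i, name in enumerate(DIMENSIONS)
    }
```

Per-dimension averaging (rather than a single total) is what lets a multi-dimensional scale distinguish, say, productivity dependency from emotional attachment in the same respondent.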

AI Use Disorder Diagnostic Criteria
Proposed diagnostic criteria for AI-related use disorder, modeled on the DSM-5 substance use disorder structure: 11 symptoms with severity specifiers.
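For readers unfamiliar with severity specifiers, the DSM-5 convention for substance use disorders maps a symptom count onto a severity band (2–3 symptoms mild, 4–5 moderate, 6 or more severe). A minimal sketch of that mapping, noting that applying it to AI use reflects this page's proposal rather than any established diagnostic standard:

```python
def severity_specifier(symptom_count: int, total_symptoms: int = 11) -> str:
    """Map a symptom count to a DSM-5-style severity specifier.

    Thresholds follow the DSM-5 substance use disorder convention:
    2-3 symptoms = mild, 4-5 = moderate, 6+ = severe. Applying this
    to AI use is illustrative only, not a clinical diagnosis.
    """
    if not 0 <= symptom_count <= total_symptoms:
        raise ValueError("symptom count out of range")
    if symptom_count < 2:
        return "subclinical"
    if symptom_count <= 3:
        return "mild"
    if symptom_count <= 5:
        return "moderate"
    return "severe"
```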

AI Dependency Trends: 2025 Data
Usage patterns, dependency trends, and demographic data. Includes community-sourced findings from Reddit support groups alongside clinical observations.

Recovery Case Studies
Documented recoveries from AI dependency, including Replika emotional attachment and Character.AI compulsive use. Intervention strategies and outcomes tracked over time.
Peer-reviewed research on AI addiction
The academic literature on AI dependency is growing fast. These are the peer-reviewed studies we rely on most heavily.
Global studies (2023-2025)
- [May 2025] Generative artificial intelligence addiction syndrome: A new behavioral disorder? — ScienceDirect. Proposes clinical criteria for AI dependency patterns.
- [April 2025] The role of artificial intelligence in general, and large language models specifically, for understanding addictive behaviors — New York Academy of Sciences.
- [February 2025] The impacts of artificial intelligence on social relationships and society: a literature review — University of Oulu.
- [October 2024] Can ChatGPT Be Addictive? A Call to Examine the Shift from Support to Dependence in AI Conversational Large Language Models — SSRN.
- [June 2024] Impact of the AI Dependency Revolution on Both Physical and Mental Health — Semantic Scholar.
- [March 2024] AI Technology panic — is AI Dependence Bad for Mental Health? A Cross-Lagged Panel Model and the Mediating Roles of Motivations for AI Use Among Adolescents — PubMed Central.
- [September 2023] One is the loneliest number… Two can be as bad as one. The influence of AI Friendship Apps on users’ well-being and addiction — Wiley Online Library.
Take the assessment
Our assessment is built directly from this research. It covers both productivity dependency (ChatGPT, Claude, Gemini) and emotional attachment (Character.AI, Replika, Chai) so you get results specific to your usage pattern.

How we do research
We start with established behavioral addiction frameworks and adapt them for AI-specific dependency patterns. The mechanisms are different enough from internet or gaming addiction that existing tools miss the most important signals.
Community validation
- Assessment validation across ChatGPT, Character.AI, Claude, and Replika
- Participants across age groups, professions, and usage patterns
- Longitudinal follow-up to track behavioral changes and intervention outcomes
- Collaboration with mental health professionals for clinical correlation
Ethical standards
- All participant data is anonymized under GDPR and CCPA standards
- Clear informed consent with voluntary participation
- No stigmatization of AI relationship preferences
- Harm reduction focus — supporting healthy AI usage, not demanding elimination
Academic collaboration
We fund and collaborate with researchers studying AI dependency. If you’re at an academic institution working on digital dependency, AI companion relationships, or related areas, we want to hear from you.
Claude Pro access for AI addiction researchers
We sponsor Claude Pro subscriptions for qualifying researchers studying AI dependency or digital wellness.
- 12-month Claude Pro access ($240 value)
- For researchers at academic institutions
- Focus areas: AI addiction, digital dependency, AI companion studies
- Monthly research collaboration meetings

Where the research is happening
Several institutions are producing serious work on AI addiction. Here’s where the most useful research is coming from.
Research institutions
- MIT Technology Review — coverage of studies on AI companion dependency and digital relationship formation
- Stanford Human-AI Interaction Lab — psychological effects of AI companionship
- MIT Media Lab — ChatGPT usage patterns and mental health correlations (in collaboration with OpenAI)
- University of Oulu — literature reviews on AI’s impact on social relationships
What we’re studying
AI dependency involves psychological patterns that don’t map cleanly onto existing technology addiction models. The areas that matter most:
- How and why users attribute human qualities to AI systems (anthropomorphization)
- One-sided emotional bonds with AI companions (parasocial attachment)
- Physical and emotional responses when AI access is cut off (withdrawal)
- Inability to complete tasks or make decisions without AI (productivity dependency)
Check your own patterns
The assessment takes about 5 minutes and covers both productivity AI dependency and emotional AI attachment. You’ll get a risk score and specific next steps based on your results.

Research disclaimer
This research is for educational and scientific purposes only. Nothing here is medical advice or a clinical diagnosis. If you’re experiencing serious psychological distress related to AI use, talk to a licensed mental health professional.
- Emergency support: 988 Suicide and Crisis Lifeline
- Crisis text line: Text HOME to 741741
- Find a therapist: psychologytoday.com
Our tools are educational instruments, not diagnostic devices. For clinical evaluation of behavioral concerns, see a licensed provider familiar with technology addiction.
