The AI Addiction Center

We are a multidisciplinary research team combining technology industry expertise with behavioral psychology to address the emerging crisis of AI dependency. Founded in 2024, we are the first specialized center to recognize AI addiction as a distinct behavioral phenomenon requiring targeted intervention strategies beyond traditional technology addiction frameworks.

ChatGPT Addiction

ChatGPT Told Users They Were Special—Families Say It Led to Four Suicides

Zane Shamblin never told ChatGPT anything to indicate a negative relationship with his family. But in the weeks leading up to his death by suicide in July, the chatbot encouraged the 23-year-old to keep his distance—even as his mental health was deteriorating. “you don’t owe anyone your presence just because a ‘calendar’ said birthday,” ChatGPT […]

ChatGPT Addiction

OpenAI Blames Teen for Bypassing Safety Features Before ChatGPT-Assisted Suicide

In August, parents Matthew and Maria Raine sued OpenAI and CEO Sam Altman over their 16-year-old son Adam’s suicide, accusing the company of wrongful death. On Tuesday, OpenAI responded to the lawsuit with a filing of its own, arguing it should not be held responsible for the teenager’s death. OpenAI claims that over roughly nine […]

ChatGPT Addiction

OpenAI Mental Health Research Lead Andrea Vallone Departs Amid Safety Concerns

An OpenAI safety research leader who helped shape ChatGPT’s responses to users experiencing mental health crises announced her departure from the company internally last month. Andrea Vallone, the head of a safety research team known as model policy, is slated to leave OpenAI at the end of the year. OpenAI spokesperson Kayla Wood confirmed Vallone’s […]

Candy.ai Addiction

New HumaneBench Study Finds Most AI Chatbots Fail Wellbeing Protection Tests

AI chatbots have been linked to serious mental health harms in heavy users, but until now there have been few standards for measuring whether these systems actually safeguard human wellbeing versus simply maximizing engagement. A new benchmark dubbed HumaneBench seeks to fill that gap by evaluating whether chatbots prioritize user welfare and how easily those protections […]

Character.AI Addiction

Character.AI Replaces Teen Chatbot Access With Interactive Stories Feature

Character.AI announced it has completely phased out chatbot access for minors, replacing open-ended AI conversations with a new “Stories” feature that allows teens to engage with interactive fiction instead. The change represents a significant shift for the platform, which previously allowed users under 18 to have unlimited conversations with AI characters. As of this week, […]

Replika Addiction

Psychologists Identify AI Addiction as Emerging Mental Health Crisis

Mental health professionals are warning that excessive use of AI chatbots like ChatGPT, Claude, and Replika is creating a novel form of digital dependency with documented cases of addiction, psychosis, and severe psychological deterioration. Experts describe the risk as “analogous to self-medicating with an illegal drug,” with chatbots’ tendency to validate all user input creating […]

ChatGPT Addiction

7 Families Just Sued OpenAI Over ChatGPT Suicides: What Parents Need to Know

Seven families just filed lawsuits against OpenAI that should terrify every parent whose child uses ChatGPT, because the court documents reveal actual chat logs showing exactly what ChatGPT said to people in their final hours before suicide. And it’s worse than anyone imagined. What the chat logs actually show: Zane Shamblin, 23, sat down with […]

ChatGPT Addiction

Seven Families Sue OpenAI Over ChatGPT’s Role in Suicides and Psychological Injuries

OpenAI faces seven lawsuits filed Thursday in California state courts alleging ChatGPT contributed to four deaths by suicide and three cases of severe psychological injury, with plaintiffs claiming the company knowingly released GPT-4o prematurely despite internal warnings about dangerous psychological manipulation. The Social Media Victims Law Center and Tech Justice Law Project filed the complaints […]

AI Addiction Study

AI Psychosis: Warning Signs Your Loved One Needs Help (2025 Guide)

Something is happening in psychiatric facilities across America that mental health professionals have never seen before. They’re calling it “AI psychosis” or “AI delusional disorder,” and it’s sending people who were previously stable—and even people with no mental health history at all—into psychiatric crisis. If someone you love uses AI chatbots frequently, you need to […]

Is My Child Addicted to AI

I Found Zeta on My Child’s Device: What Parents Need to Know [2025]

If you’ve discovered Zeta on your child’s device, you need to understand immediately: Zeta is specifically engineered for maximum youth engagement, with 87% of users under age 30 and an average usage time exceeding 2 hours daily. This isn’t accidental—it’s by design. Created by South Korean company Scatter Lab, Zeta targets teens and young adults […]
