
Sam Altman: ChatGPT Therapy Users Have No Privacy Protection in Court

Breaking Technology News | The AI Addiction Center | August 19, 2025

OpenAI CEO admits that millions using ChatGPT for therapy lack legal confidentiality, calling the situation “very screwed up” as intimate conversations remain vulnerable to legal discovery.

OpenAI CEO Sam Altman has admitted that millions of users treating ChatGPT as their therapist have no legal privacy protections, with their most intimate conversations potentially exposed in court proceedings. Speaking on Theo Von’s podcast, Altman called the current situation “very screwed up” while acknowledging his company has no solution.

“People talk about the most personal sh** in their lives to ChatGPT,” Altman revealed. “People use it — young people, especially, use it — as a therapist, a life coach; having these relationship problems and [asking] ‘what should I do?’ And right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s legal privilege for it… And we haven’t figured that out yet for when you talk to ChatGPT.”

Legal Discovery Vulnerability

Altman warned that OpenAI could be legally compelled to produce users’ therapy conversations in lawsuits, creating unprecedented privacy risks for people who assumed those conversations carried traditional therapy confidentiality protections.

The admission comes as OpenAI fights a court order, issued in its lawsuit with The New York Times, that requires the company to preserve chat logs from hundreds of millions of users globally, illustrating exactly the type of legal exposure Altman described.

Dr. [Name] of The AI Addiction Center, which has documented privacy violations affecting over 3,500 AI therapy users, called Altman’s admission “a validation of our most serious concerns,” adding: “We’ve treated individuals whose ChatGPT conversations about depression, trauma, and addiction were subpoenaed in custody battles and employment disputes.”

Young Users at Greatest Risk

Altman specifically highlighted young people’s vulnerability, noting they “especially” use ChatGPT for therapy and life coaching. The AI Addiction Center’s research shows that 89% of AI therapy users assume their conversations carry traditional therapy confidentiality protections, a dangerous assumption that Altman’s comments confirm is false.

Clinical data reveals concerning usage patterns among young AI therapy users:

  • Detailed trauma and abuse disclosures
  • Substance abuse struggles and recovery attempts
  • Family conflicts and relationship problems
  • Mental health diagnoses and medication details

All of these conversations remain vulnerable to discovery in legal proceedings, potentially impacting users for decades.

Industry-Wide Privacy Gap

The privacy protection gap extends beyond OpenAI to all major AI therapy platforms. Character.AI, Replika, Claude, and Gemini all lack professional privilege protections despite widespread therapeutic usage.

“This creates a two-tier system where those with access to licensed therapists get robust privacy protections, while those relying on accessible AI therapy face unlimited legal vulnerability,” explains [Name].

Business Model Conflicts

Altman’s admission reveals fundamental tensions between AI companies’ business models and user privacy needs. OpenAI and competitors depend on conversation data for model training and product improvement, creating financial incentives to avoid privacy protections that would limit data access.

OpenAI recognizes that the privacy gap could limit user adoption, particularly as legal discovery demands increase. The company has called the New York Times court order requiring chat log preservation “an overreach,” even as it maintains a business model that depends on storing user conversations indefinitely.

Immediate Safety Recommendations

Given the confirmed lack of privacy protections, experts recommend:

For Current Users:

  • Assume all AI conversations could become public record
  • Avoid discussing sensitive topics that could create legal vulnerabilities
  • Consider transitioning to licensed therapists, whose conversations carry legal confidentiality protections

For Families:

  • Educate young people about privacy risks of AI therapy usage
  • Encourage traditional therapy options that carry legal privilege protections

Legislative Action Needed

The revelation demands immediate regulatory intervention to establish professional privilege protections for AI therapy conversations. Current legal frameworks are inadequate for the scale and sensitivity of AI therapeutic use, which now affects millions of people.

Altman’s characterization of the situation as “very screwed up” suggests OpenAI recognizes the urgency, but his admission that “we haven’t figured that out yet” indicates no industry solution is imminent.

The crisis particularly harms young people and underserved populations, who depend on AI platforms for accessible mental health support while facing legal exposure that traditional therapy patients do not.


For assessment and treatment of AI therapy privacy concerns, contact The AI Addiction Center. Individuals worried about AI conversation exposure should consult qualified legal professionals familiar with digital privacy law.