Sam Altman warns there’s no legal confidentiality when using ChatGPT as a therapist

by Alan North

ChatGPT users may want to think twice before turning to their AI app for therapy or other kinds of emotional support. According to OpenAI CEO Sam Altman, the AI industry hasn’t yet figured out how to protect user privacy when it comes to these more sensitive conversations, because there’s no doctor-patient confidentiality when your doc is an AI.

The exec made these comments on a recent episode of Theo Von’s podcast, This Past Weekend w/ Theo Von.

In response to a question about how AI works with today’s legal system, Altman said one of the problems of not yet having a legal or policy framework for AI is that there’s no legal confidentiality for users’ conversations.

“People talk about the most personal sh** in their lives to ChatGPT,” Altman said. “People use it — young people, especially, use it — as a therapist, a life coach; having these relationship problems and [asking] ‘what should I do?’ And right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s legal privilege for it. There’s doctor-patient confidentiality, there’s legal confidentiality, whatever. And we haven’t figured that out yet for when you talk to ChatGPT.”

This could create a privacy concern for users in the case of a lawsuit, Altman added, because OpenAI would be legally required to produce those conversations today.

“I think that’s very screwed up. I think we should have the same concept of privacy for your conversations with AI that we do with a therapist or whatever — and no one had to think about that even a year ago,” Altman said.

The company understands that the lack of privacy could be a blocker to broader user adoption. Beyond the vast amounts of online data AI models consume during training, companies are also being asked to produce data from users’ chats in some legal contexts. OpenAI is already fighting a court order in its lawsuit with The New York Times that would require it to retain the chats of hundreds of millions of ChatGPT users globally, excluding those from ChatGPT Enterprise customers.

In a statement on its website, OpenAI said it’s appealing the order, which it called “an overreach.” If the court can override OpenAI’s own decisions around data privacy, it could open the company up to further demands from legal discovery or law enforcement. Tech companies are regularly subpoenaed for user data to aid in criminal prosecutions. In more recent years, though, concerns about digital data have grown as laws began limiting access to previously established freedoms, like a woman’s right to choose.

When the Supreme Court overturned Roe v. Wade, for example, customers began switching to more private period-tracking apps or to Apple Health, which encrypted their records.

Altman also asked the podcast host about his own ChatGPT usage; Von said he didn’t talk to the AI chatbot much because of his own privacy concerns.

“I think it makes sense … to really want the privacy clarity before you use [ChatGPT] a lot — like the legal clarity,” Altman said.


