Think your ChatGPT therapy sessions are private? Think again.

Jul 25, 2025 - 22:00

If you’ve been confessing your deepest secrets to an AI chatbot, it might be time to reevaluate. 

With more people turning to AI for instant life coaching, tools like ChatGPT are sucking up massive amounts of personal information about their users. While that data stays private under ideal circumstances, it could be dredged up in court – a scenario that OpenAI CEO Sam Altman warned users about during an appearance on Theo Von’s popular podcast this week.

“One example that we’ve been thinking about a lot… people talk about the most personal shit in their lives to ChatGPT,” Altman said. “Young people especially, use it as a therapist, as a life coach, ‘I’m having these relationship problems, what should I do?’ And right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s legal privilege for it, there’s doctor patient confidentiality, there’s legal confidentiality.”

Altman says that as a society we “haven’t figured that out yet” for ChatGPT. He called for a policy framework for AI, though in reality OpenAI and its peers have lobbied for a light regulatory touch.

“If you go talk to ChatGPT about your most sensitive stuff and then there’s a lawsuit or whatever, we could be required to produce that, and I think that’s very screwed up,” Altman told Von, arguing that AI conversations should be treated with the same level of privacy as a chat with a therapist. 

While interactions with doctors and therapists are protected by federal privacy laws in the U.S., exceptions exist for instances in which someone is a threat to themselves or others. And even with those strong privacy protections, relevant medical information can be surfaced by court order, subpoena, or warrant.

Altman’s argument seems to be that from a regulatory perspective, ChatGPT shares more in common with licensed, trained specialists than it does with a search engine. “I think we should have the same concept of privacy for your conversations with AI that we do with a therapist,” he said.

Altman also expressed concerns about how AI will adversely impact mental health, even as people seek its advice in lieu of the real thing.

“Another thing I’m afraid of… is just what this is going to mean for users’ mental health. There’s a lot of people that talk to ChatGPT all day long,” Altman said. “There are these new AI companions that people talk to like they would a girlfriend or boyfriend.

“I don’t think we know yet the ways in which [AI] is going to have those negative impacts, but I feel for sure it’s going to have some, and we’ll have to, I hope, we can learn to mitigate it quickly.”
