Sam Altman, OpenAI’s CEO, is back at it again. Along with saying recently that ChatGPT’s newest model, GPT-5, is so good it’s actually scary, he’s gone on record about the risks people take when they use ChatGPT (OpenAI’s generative AI) as their personal therapist.

One could be forgiven for using ChatGPT as a therapy surrogate, given the disgusting state of healthcare and the rich-folks-only state of mental health treatment. Less forgivable is the “we can’t help it” attitude coming from OpenAI itself.

After warning that users inputting personal information into ChatGPT during their “therapy sessions” aren’t covered by the kind of privacy protections that apply to actual therapists, the company is trying to have it both ways: to appear responsible while deflecting the fact that they’re the ones holding the reins.

HIPAA doesn’t apply to ChatGPT

“People talk about the most personal s**t in their lives to ChatGPT,” Altman said on the July 23 episode of This Past Weekend w/ Theo Von. “People use it—young people, especially, use it—as a therapist, a life coach; having these relationship problems and (asking) ‘what should I do?’

“And right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s legal privilege for it. There’s doctor-patient confidentiality, there’s legal confidentiality, whatever. And we haven’t figured that out yet for when you talk to ChatGPT.”

In fairness to Altman and OpenAI, neither he nor the company pitches ChatGPT as a therapist. It’s not marketed that way, and building a net to “catch” and prevent people from using it for therapy would presumably be a lot harder and vaguer than, say, detecting and prohibiting people from using it for porn.

HIPAA (the Health Insurance Portability and Accountability Act) doesn’t apply to ChatGPT because it deals specifically with the electronic transmission of healthcare information related to insurance.
ChatGPT has nothin’ to do with insurance, so it doesn’t apply.

Altman, in the episode, pointed out that if OpenAI were compelled by law to release chat logs containing personal information a user had discussed with ChatGPT, the company would have to hand them over.

“I think we should have the same concept of privacy for your conversations with AI that we do with a therapist or whatever—and no one had to think about that even a year ago,” Altman said.

Generative AI is in a very immature state. I wouldn’t trust it with my gym locker combination, much less anything related to my mental health.

The post Don’t Use ChatGPT as Your Therapist—Unless You Want Your Private Information Leaked appeared first on VICE.