Sam Altman gives really good reason why ChatGPT shouldn’t be your therapist

July 26, 2025

If you need another reason to reconsider using an AI chatbot as your therapist, take it from OpenAI CEO Sam Altman.

In a recent appearance on This Past Weekend with Theo Von, Altman admitted to the comedian that the AI industry hasn’t yet solved the issue of user privacy when it comes to sensitive conversations. Unlike a licensed professional, an AI chatbot isn’t bound by doctor-patient confidentiality, and legally, your most personal chats aren’t protected.


“People talk about the most personal shit in their lives to ChatGPT,” Altman said. “Young people especially use it as a therapist, a life coach, asking about relationship problems and what to do.”

But there’s a major difference: “Right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s legal privilege for it… We haven’t figured that out yet for when you talk to ChatGPT.”


Without confidentiality protections, anything said in an AI therapy session could be accessed or even subpoenaed in court. The AI industry currently operates in a legal gray area, as the Trump administration continues to navigate the clash between federal and state authority over AI regulation.

While a few federal laws targeting deepfakes exist, how user data from AI chats can be used still depends heavily on state laws. This patchwork of regulations creates uncertainty — especially around privacy — which could hinder broader user adoption. Adding to the concern, AI models already rely heavily on online data for training and, in some cases, are now being asked to produce user chat data in legal proceedings.

In the case of ChatGPT specifically, OpenAI is currently required to retain records of all user conversations — even those users have deleted — due to its ongoing legal battle with The New York Times. The company is challenging the court’s ruling and is actively seeking to have it overturned.

“No one had to think about that even a year ago,” Altman said, calling the situation “very screwed up.”


Disclosure: Ziff Davis, Mashable’s parent company, in April filed a lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.
