ChatGPT Health lets you connect medical records to an AI that makes things up

January 8, 2026

But despite OpenAI’s talk of supporting health goals, the company’s terms of service directly state that ChatGPT and other OpenAI services “are not intended for use in the diagnosis or treatment of any health condition.”

It appears that policy is not changing with ChatGPT Health. OpenAI writes in its announcement, “Health is designed to support, not replace, medical care. It is not intended for diagnosis or treatment. Instead, it helps you navigate everyday questions and understand patterns over time—not just moments of illness—so you can feel more informed and prepared for important medical conversations.”

A cautionary tale

The SFGate report on Sam Nelson’s death illustrates why maintaining that disclaimer matters legally. According to chat logs reviewed by the publication, Nelson first asked ChatGPT about recreational drug dosing in November 2023. The AI assistant initially refused and directed him to health care professionals. But over 18 months of conversations, ChatGPT’s responses reportedly shifted. Eventually, the chatbot told him things like “Hell yes—let’s go full trippy mode” and recommended he double his cough syrup intake. His mother found him dead from an overdose the day after he began addiction treatment.

While Nelson’s case did not involve analyzing doctor-sanctioned health care instructions of the type ChatGPT Health will link to, it is not unique: many people have been misled by chatbots that provide inaccurate information or encourage dangerous behavior, as we have covered in the past.

That’s because AI language models can easily confabulate, generating plausible but false information in a way that makes it difficult for some users to distinguish fact from fiction. The AI models behind services like ChatGPT use statistical relationships in their training data (such as text from books, YouTube transcripts, and websites) to produce plausible responses rather than necessarily accurate ones. Moreover, ChatGPT’s outputs can vary widely depending on who is using the chatbot and what has previously taken place in the user’s chat history (including stored notes about previous chats).
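To make that variability concrete, here is a minimal Python sketch of temperature-scaled token sampling, the basic mechanism by which a language model picks each next word from a probability distribution learned from its training data. This is not OpenAI’s implementation, and the token scores below are invented purely for illustration; the point is only that answers are sampled from learned statistical patterns rather than retrieved from a store of verified facts.

```python
import math
import random

def sample_next_token(scores, temperature=1.0):
    """Pick the next token from a model's raw scores.

    Higher temperature flattens the distribution, so less likely
    (and potentially less accurate) continuations get chosen more often.
    """
    # Convert raw scores to probabilities with a temperature-scaled softmax.
    scaled = {tok: s / temperature for tok, s in scores.items()}
    max_s = max(scaled.values())
    exps = {tok: math.exp(s - max_s) for tok, s in scaled.items()}
    total = sum(exps.values())
    probs = {tok: v / total for tok, v in exps.items()}
    tokens, weights = zip(*probs.items())
    return random.choices(tokens, weights=weights, k=1)[0]

# Hypothetical scores for the word that follows "A safe dose is ..."
# These numbers are made up for illustration; a real model learns them
# from statistical patterns in its training data, not from verified facts.
fake_scores = {"unknown": 2.0, "small": 1.5, "double": 0.5}

for temp in (0.2, 1.0, 1.5):
    picks = [sample_next_token(fake_scores, temp) for _ in range(1000)]
    freq = {tok: picks.count(tok) / len(picks) for tok in fake_scores}
    print(f"temperature={temp}: {freq}")
```

Run a few times, the low-temperature setting almost always picks the most probable token, while higher temperatures increasingly surface the unlikely (and here, dangerous) completion. That sampling step is one reason identical questions can draw different answers in different conversations.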
