ChatGPT users can now disable storage of chats, and not have them used as training data

April 25, 2023

ChatGPT users now have the option of keeping their chat history private.

In a blog post on Tuesday, OpenAI announced a new setting that allows users to disable their chat history. When history is disabled, conversations shared with ChatGPT will not be used to improve the model; instead, they are retained for 30 days and then deleted from OpenAI's systems. Previously, the only way to prevent your data from being used to train the model was to opt out through a form linked in one of OpenAI's articles about its privacy policy. The new setting makes turning off data sharing far easier and more accessible.


The updated privacy setting comes on the heels of a recent privacy breach and rising ethical and regulatory concerns about how ChatGPT data is protected. The breach temporarily exposed some users' personal and financial data to other users. Citing inadequate user data protections under Europe's sweeping General Data Protection Regulation (GDPR), Italy banned ChatGPT. Along the same lines, a complaint was filed with the Federal Trade Commission (FTC) alleging that ChatGPT violates misinformation laws. OpenAI has since pledged its commitment to safety and security, saying it will "continue to enhance safety precautions as our AI systems evolve."

OpenAI also announced the development of a ChatGPT Business subscription, "for professionals who need more control over their data as well as enterprises seeking to manage their end users." A ChatGPT Business subscription would fall under OpenAI's API data usage policy, under which chat content is not used to train its models. That's sure to be a relief for companies worried about their workers using ChatGPT after Samsung employees inadvertently shared confidential code with the chatbot. OpenAI says the business subscription will roll out in the coming months.
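For teams that want the API policy's data-handling behavior before ChatGPT Business arrives, routing prompts through the API rather than the web UI is the usual workaround. As a rough, hypothetical sketch (the endpoint URL and payload shape follow OpenAI's public Chat Completions API; the key, model name, and prompt are placeholders, and the request is only built here, not sent):

```python
import json

# Endpoint covered by OpenAI's API data usage policy, per the article.
API_URL = "https://api.openai.com/v1/chat/completions"


def build_request(prompt: str, model: str = "gpt-3.5-turbo") -> str:
    """Build the JSON body for a Chat Completions call (not sent here)."""
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })


# A real client would POST this body to API_URL with an
# "Authorization: Bearer <YOUR_API_KEY>" header.
body = build_request("Summarize this quarter's sales notes.")
```

Whether this sketch matches your obligations depends on your organization's own data-handling rules; the payload above only illustrates the request shape.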

How to disable ChatGPT chat history

To change your account settings in ChatGPT, click your account name, then click Settings. In the window that pops up, click "Show" if your "Data Controls" are hidden. This reveals a toggle labeled "Chat History & Training." Switch the toggle off to disable it.

Toggle off “Chat History & Training” to disable data sharing with the model.
Credit: OpenAI


    © CC Startup, Powered by Creative Collaboration. © 2020 Creative Collaboration, LLC. All Rights Reserved.
