Microsoft might be saving your conversations with Bing Chat

August 16, 2023

Uh-oh — Microsoft might be storing information from your Bing chats.

This is probably totally fine as long as you’ve never chatted about anything you wouldn’t want anyone else reading, or if you thought your Bing chats would be deleted, or if you thought you had more privacy than you actually have.

Microsoft has added new AI policies to its terms of service. Introduced on July 30 and taking effect on Sept. 30, the policy states: “As part of providing the AI services, Microsoft will process and store your inputs to the service as well as output from the service, for purposes of monitoring for and preventing abusive or harmful uses or outputs of the service.”


According to The Register’s reading of the new “AI Services” clause in Microsoft’s terms of service, Microsoft can store your conversations with Bing if you’re not an enterprise user, and it’s unclear for how long.

Microsoft did not immediately respond to a request for comment from Mashable, and a spokesperson from Microsoft declined to comment to the Register about how long it will store user inputs.

“We regularly update our terms of service to better reflect our products and services,” a representative said in a statement to the Register. “Our most recent update to the Microsoft Services Agreement includes the addition of language to reflect artificial intelligence in our services and its appropriate use by customers.”

Beyond storing data, there were four additional policies in the new AI Services clause. Users cannot use the AI service to “discover any underlying components of the models, algorithms, and systems.” Users are not allowed to extract data from the AI services. Users cannot use the AI services to “create, train, or improve (directly or indirectly) any other AI service.” And finally, users are “solely responsible for responding to any third-party claims regarding Your use of the AI services in compliance with applicable laws (including, but not limited to, copyright infringement or other claims relating to content output during Your use of the AI services).”

So maybe be a bit more careful while using Microsoft Bing chats, or switch to Bing Chat Enterprise, which Microsoft said in July does not save conversations.
