Apple to pay $95 million settlement for Siri listening to your private conversations

January 2, 2025

Apple has agreed to pay $95 million to settle a class-action lawsuit alleging that private Siri conversations were inadvertently recorded and listened to by third-party contractors.

If U.S. District Judge Jeffrey White approves the proposed settlement, filed on Tuesday in federal court in Oakland, California, affected users will receive up to $20 per Siri-enabled Apple device, such as the iPhone and Apple Watch.

The lawsuit centers on customer complaints that Siri was unintentionally activated and on a 2019 whistleblower report in The Guardian that Apple contractors heard voice recordings while testing for quality control. Those recordings included “confidential medical information, drug deals, and recordings of couples having sex,” according to the investigation. Siri is only supposed to activate upon hearing the wake word “Hey Siri,” but there were reported instances of Siri being triggered by other cues, such as the sound of a zipper, an Apple Watch being raised in a certain way, or simply the sound of a voice.

Apple users claimed that private conversations were recorded and then shared with third-party advertisers; they later saw ads for products mentioned in those conversations, and in one case for a surgical treatment a user had discussed with their doctor. Apple subsequently issued a formal apology and said it would no longer save voice recordings.

The lawsuit covers the period from Sept. 17, 2014, to Dec. 31, 2024. To claim their share of the settlement, Apple users must submit a claim for up to five Siri-enabled Apple devices (iPhone, iPad, Apple Watch, MacBook, iMac, HomePod, iPod touch, or Apple TV) and swear under oath that they inadvertently activated Siri “during a conversation intended to be confidential or private,” according to the settlement proposal.
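At up to $20 per device and a cap of five devices per claim, that works out to a maximum payout of $100 (5 × $20) per claimant.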

Apple isn’t the only company in trouble over privacy violations involving voice assistants. Google is facing a similar class-action lawsuit over Google Assistant being triggered without its wake words.
