Apple sued for a billion dollars over alleged failure to block child sex abuse materials

December 10, 2024

Apple is once again facing a billion-dollar lawsuit, as thousands of victims come forward against the company over its alleged complicity in the spread of child sex abuse materials (CSAM).

In a lawsuit filed Dec. 7, the tech giant is accused of reneging on mandatory reporting duties — which require U.S.-based tech companies to report instances of CSAM to the National Center for Missing & Exploited Children (NCMEC) — and allowing CSAM to proliferate. In failing to institute promised safety mechanisms, the lawsuit claims, Apple has sold “defective products” to specific classes of customers (CSAM victims).

Some of the plaintiffs argue they have been continuously re-traumatized by the spread of this content long after their childhoods, while Apple has chosen to focus on preventing new cases of CSAM and the grooming of young users.

“Thousands of brave survivors are coming forward to demand accountability from one of the most successful technology companies on the planet. Apple has not only rejected helping these victims, it has advertised the fact that it does not detect child sex abuse material on its platform or devices thereby exponentially increasing the ongoing harm caused to these victims,” wrote lawyer Margaret E. Mabie.

The company has retained tight control over its iCloud product and user libraries as part of its wider privacy promises. In 2022, Apple scrapped its plans for a controversial tool that would automatically scan and flag iCloud photo libraries for abusive or problematic material, including CSAM. The company cited growing concern over user privacy and mass surveillance by Big Tech in its choice to no longer introduce the scanning feature, and Apple’s choice was widely supported by privacy groups and activists around the world. But the new lawsuit argues that the tech giant merely used this cybersecurity defense to skirt its reporting duties.

“Child sexual abuse material is abhorrent and we are committed to fighting the ways predators put children at risk,” wrote Apple spokesperson Fred Sainz in response to the lawsuit. “We are urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users. Features like Communication Safety, for example, warn children when they receive or attempt to send content that contains nudity to help break the chain of coercion that leads to child sexual abuse. We remain deeply focused on building protections that help prevent the spread of CSAM before it starts.”

Tech companies have struggled to control the spread of abusive material online. A 2024 report by UK watchdog the National Society for the Prevention of Cruelty to Children (NSPCC) accused Apple of vastly underreporting the amount of CSAM shared across its products, with the company submitting just 267 worldwide reports of CSAM to NCMEC in 2023. Competitors Google and Meta reported more than 1 million and 30 million cases, respectively. Meanwhile, growing concern over the rise of digitally altered or synthetic CSAM has complicated the regulatory landscape, leaving tech giants and social media platforms racing to catch up.

While Apple faces potential billion-dollar damages should the case go to trial and a jury side with the plaintiffs, the outcome carries even wider repercussions for the industry and for privacy efforts at large. The court could force Apple to revive its photo library scanning tool or to implement other industry features to remove abusive content, paving a more direct path toward government surveillance and dealing another blow to Section 230 protections.
