Apple Sued for Dropping CSAM Detection Features in 2022

A victim of childhood sexual abuse is suing Apple over its 2022 decision to drop planned CSAM detection features that would have scanned iCloud-stored images for child sexual abuse material.

A victim of childhood sexual abuse is suing Apple over its 2022 decision to drop a previously announced plan to scan images stored in iCloud for child sexual abuse material (CSAM). The woman, who filed under a pseudonym, says Apple broke its promise to protect victims like her when it abandoned the CSAM-scanning feature. Her lawsuit demands changes to Apple's practices and potential compensation for up to 2,680 other eligible victims.

#Apple #Lawsuit #CSAMDetection