Apple Sued for Dropping CSAM Detection Features in 2022

A victim of childhood sexual abuse is suing Apple over its 2022 decision to drop planned CSAM detection features that would have scanned iCloud-stored images for child sexual abuse material.

A victim of childhood sexual abuse is suing Apple over its 2022 decision to abandon a previously announced plan to scan images stored in iCloud for child sexual abuse material (CSAM). The woman, who filed under a pseudonym, says Apple broke its promise to protect victims like her when it eliminated the CSAM-scanning feature. Her lawsuit demands changes to Apple's practices and potential compensation for up to 2,680 other eligible victims.

#Apple #Lawsuit #CSAMDetection
