
Apple Sued for Dropping CSAM Detection Features in 2022

A victim of childhood sexual abuse is suing Apple over its 2022 decision to drop CSAM detection features that were planned to scan iCloud-stored images for child sexual abuse material.

A victim of childhood sexual abuse is suing Apple over its 2022 decision to abandon a previously announced plan to scan images stored in iCloud for child sexual abuse material (CSAM). The woman, who is using a pseudonym, says Apple broke its promise to protect victims like her when it eliminated the CSAM-scanning feature. Her lawsuit demands changes to Apple's practices and seeks potential compensation for up to 2,680 other eligible victims.

#Apple #Lawsuit #CSAMDetection
