Apple Sued for Dropping CSAM Detection Features in 2022

A lawsuit accuses Apple of breaking its promise to abuse victims when it abandoned its planned CSAM detection features in 2022.

A victim of childhood sexual abuse is suing Apple over its 2022 decision to drop a previously announced plan to scan images stored in iCloud for child sexual abuse material (CSAM). The woman, who is using a pseudonym, says Apple broke its promise to protect victims like her when it eliminated the CSAM-scanning feature. Her lawsuit demands changes to Apple's practices and potential compensation for up to 2,680 other eligible victims.

#Apple #Lawsuit #CSAMDetection
