Apple Sued for Dropping CSAM Detection Features in 2022

A victim of childhood sexual abuse is suing Apple over its 2022 decision to abandon planned CSAM detection features that would have scanned iCloud-stored images for child sexual abuse material.
By Blip Tech 1 min read

A victim of childhood sexual abuse is suing Apple over its 2022 decision to drop a previously announced plan to scan images stored in iCloud for child sexual abuse material (CSAM). The woman, who is using a pseudonym, says Apple broke its promise to protect victims like her when it eliminated the CSAM-scanning feature. Her lawsuit demands changes to Apple's practices and seeks potential compensation for up to 2,680 other eligible victims.

#Apple #lawsuit #CSAMdetection

About Blip Tech

Blip Tech is your go-to source for fast, reliable technology news. We cover everything from the latest Apple and Google announcements to breakthroughs in artificial intelligence, new smartphone releases, computer hardware, and everyday tech tips and how-tos. Our mission is to keep you informed without the fluff — just the news you need, delivered clearly and concisely.