Articles
Apple Sued for Dropping Tool to Detect Child Sexual Abuse Material in iCloud
PCMag — The lawsuit says Apple's failure to implement CSAM detection has caused harmful content to continue circulating. Apple canceled the detection tool years ago over privacy concerns.
Apple is facing a $1.2 billion lawsuit over its decision to abandon a planned tool that would have scanned iCloud photos for child sexual abuse material (CSAM), The New York Times reports.
The lawsuit was filed in US District Court in Northern California by a 27-year-old woman who …