Communication Safety
Apple Sued Over Allegations of CSAM on iCloud
CNET· 2026-02-19 21:43
Core Viewpoint
- Apple is facing a lawsuit from West Virginia Attorney General JB McCuskey over allegations that iCloud is being used to store and distribute child sexual abuse material (CSAM), with claims that Apple has been aware of the issue for years and has not taken appropriate action [1][5].

Group 1: Allegations and Evidence
- The lawsuit includes alleged iMessage conversations between Apple executives acknowledging the presence of CSAM on iCloud as early as February 2020 [1][2].
- In these conversations, one executive reportedly called iCloud "the greatest platform for distributing child porn" and suggested that Apple has chosen to remain ignorant of the extent of the CSAM problem [2].
- Apple made only 267 CSAM reports to the National Center for Missing and Exploited Children in 2023, far fewer than Google's 1.47 million and Meta's 30.6 million [3].

Group 2: CSAM Detection Tools and Encryption
- The lawsuit claims that Apple failed to implement CSAM detection tools, including a proprietary scanning tool announced in 2021 and later abandoned [4].
- Apple's Advanced Data Protection, which enables end-to-end encryption for photos and videos on iCloud, is alleged to hinder law enforcement efforts to identify and prosecute CSAM offenders [5].
- Apple has stated that its safety and privacy measures are designed with user protection in mind, including features like Communication Safety, which aims to shield children from CSAM content [6].

Group 3: Industry and Privacy Debate
- The ongoing debate over end-to-end encryption highlights the tension between privacy and security, particularly in relation to law enforcement and cybercrime [7].
- Privacy advocates have praised the introduction of end-to-end encryption to iCloud, arguing that constant scanning for CSAM could lead to false positives and unwarranted investigations [8].
- The Electronic Frontier Foundation has emphasized that blocking end-to-end encryption would undermine online security and privacy for all users, especially young people [9].

Group 4: Legal Context
- The lawsuit was filed in the Circuit Court of Mason County, West Virginia, on February 19, 2026, following a class-action lawsuit in Northern California involving 2,680 plaintiffs who allege that Apple's inaction on CSAM scanning constitutes complicity in its distribution [10][11].
West Virginia says it has sued Apple over iCloud's alleged role in distribution of child sex abuse material
Reuters· 2026-02-19 15:03
Photo caption: A truck from the child advocacy organization Heat Initiative, calling on Apple to do more to police child sex abuse material on iCloud, is parked outside the Apple store as people line up to...
Summary
- Apple filed far fewer ...