Child sexual abuse material detection
Apple sued by West Virginia for alleged failure to stop child sexual abuse material on iCloud, iOS devices
CNBC · 2026-02-19 15:03
Core Viewpoint
- West Virginia's attorney general has filed a consumer protection lawsuit against Apple, accusing the company of failing to prevent child sexual abuse material (CSAM) from being stored and shared via its iOS devices and iCloud services [1]

Group 1: Allegations Against Apple
- The attorney general, John "JB" McCuskey, claims that Apple prioritizes its privacy branding and business interests over child safety [1]
- Other major tech companies, such as Google, Microsoft, and Dropbox, have been more proactive in combating CSAM, using systems like PhotoDNA [1]

Group 2: Technology and Features
- PhotoDNA, developed by Microsoft and Dartmouth College in 2009, uses hashing and matching to automatically identify and block CSAM images that have already been reported to authorities (see the hash-and-match sketch after these notes) [2]
- In 2021, Apple tested its own CSAM-detection features aimed at automatically finding and removing images of child exploitation and reporting them to the National Center for Missing & Exploited Children [2]

Group 3: Response to Criticism
- Apple withdrew its CSAM-detection plans after privacy advocates warned that the technology could create a backdoor for government surveillance and be misused to censor other content on iOS devices [3]
- The company's subsequent efforts have not satisfied a wide range of critics [3]
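PhotoDNA's actual algorithm is proprietary and licensed only to vetted organizations, so the sketch below is a generic illustration of the hash-and-match approach the article describes: compute a compact perceptual hash of an image, then compare it against a database of hashes of already-reported material, tolerating a few differing bits so that re-encoded or resized copies still match. The average-hash function, the Hamming-distance threshold, and all names here are illustrative assumptions, not PhotoDNA's real parameters.

```python
# Generic hash-and-match sketch. PhotoDNA itself is proprietary; this
# stand-in uses a simple average hash (aHash) over a grayscale pixel
# grid and a Hamming-distance match against a set of known hashes.
# Every name and threshold below is illustrative, not PhotoDNA's.

from typing import List, Set

GridImage = List[List[int]]  # grayscale pixels, values 0-255


def average_hash(pixels: GridImage) -> int:
    """Perceptual-style hash: one bit per pixel, set if the pixel is
    brighter than the grid's mean. Tolerant of uniform brightness
    shifts, unlike a cryptographic hash of the raw bytes."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of bit positions where two hashes differ."""
    return bin(a ^ b).count("1")


def matches_known(candidate: GridImage,
                  known_hashes: Set[int],
                  max_distance: int = 4) -> bool:
    """True if the candidate's hash lies within max_distance bits of
    any hash in the known database, so copies altered by re-encoding
    or resizing can still be flagged."""
    h = average_hash(candidate)
    return any(hamming_distance(h, k) <= max_distance for k in known_hashes)


if __name__ == "__main__":
    # Toy 4x4 "images": the second is a slightly brightened copy of the first.
    original = [[10, 200, 10, 200],
                [200, 10, 200, 10],
                [10, 200, 10, 200],
                [200, 10, 200, 10]]
    reencoded = [[min(255, p + 12) for p in row] for row in original]

    known = {average_hash(original)}        # database of reported hashes
    print(matches_known(reencoded, known))  # True: match survives re-encoding
```

The fuzzy threshold is the point of the design: an exact byte-level hash would break on any recompression, whereas a perceptual hash compared with a small distance tolerance keeps matching near-duplicates while leaving unrelated images untouched.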