
Apple sued by West Virginia for alleged failure to stop child sexual abuse material on iCloud, iOS devices


West Virginia's attorney general has filed a consumer protection lawsuit against Apple, alleging that it has failed to prevent child sexual abuse material from being stored and shared via iOS devices and iCloud services.

John "JB" McCuskey, a Republican, accused Apple of prioritizing privacy branding and its own business interests over child safety, while other big tech companies, including Google, Microsoft, and Dropbox, have been more proactive, using systems like PhotoDNA to combat such material.

PhotoDNA, developed by Microsoft and Dartmouth College in 2009, uses "hashing and matching" to automatically identify and block child sexual abuse material (CSAM): it computes a digital fingerprint of each image and compares it against fingerprints of images that have already been identified as CSAM and reported to authorities.
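PhotoDNA's actual algorithm is proprietary, but the hash-and-match pattern it uses can be illustrated with a toy perceptual hash. The sketch below is purely illustrative: the functions (`ahash`, `is_known_match`), the 4x4 pixel grids, and the distance threshold are all hypothetical stand-ins, not PhotoDNA's real fingerprinting.

```python
# Illustrative sketch of hash-and-match scanning. PhotoDNA itself is
# proprietary; this toy "average hash" only shows the general idea of
# fingerprinting an image and comparing it against a database of
# fingerprints of known images. All names here are hypothetical.

def ahash(pixels):
    """Toy perceptual hash of a grayscale pixel grid: one bit per pixel,
    set if that pixel is brighter than the image's average brightness."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    return sum(1 << i for i, p in enumerate(flat) if p > avg)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def is_known_match(pixels, known_hashes, threshold=2):
    """Flag an image whose hash is within `threshold` bits of any
    hash in the database of known images."""
    h = ahash(pixels)
    return any(hamming(h, k) <= threshold for k in known_hashes)

# Example: a 4x4 grayscale image and a one-entry hash database.
img = [[10, 200, 10, 200],
       [200, 10, 200, 10],
       [10, 200, 10, 200],
       [200, 10, 200, 10]]
db = {ahash(img)}
print(is_known_match(img, db))  # an identical image matches its own hash
```

The small Hamming-distance threshold is what distinguishes a perceptual hash from a cryptographic one: a lightly altered copy of a known image still lands within a few bits of the stored fingerprint, while an unrelated image does not.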

In 2021, Apple tested its own CSAM-detection features that could automatically find and remove images of child exploitation, and report those uploaded to iCloud in the U.S. to the National Center for Missing & Exploited Children.

But the company withdrew the features after privacy advocates warned that the technology could create a back door for government surveillance, and be tweaked and exploited to censor other kinds of content on iOS devices.

The company's efforts since then have not satisfied a broad array of critics.