
West Virginia sues Apple for allegedly letting child abuse spread in iCloud




West Virginia has filed a lawsuit against Apple, accusing the company of allowing the distribution and storage of child sexual abuse material (CSAM) in iCloud. In the complaint, filed on Thursday, West Virginia Attorney General JB McCuskey claims that by abandoning a CSAM detection system in favor of end-to-end encryption, Apple has made iCloud a “secure frictionless avenue for the possession, protection, and distribution [of] CSAM,” violating the state’s consumer protection laws.

Apple initially outlined plans in 2021 to launch a system that would check iCloud photos against a list of known CSAM images. The move was met with significant backlash from privacy advocates, with some claiming the company was building a surveillance system, and Apple abandoned the feature’s development nearly a year later. At the time, Apple’s software head Craig Federighi told The Wall Street Journal that “child sexual abuse can be headed off before it occurs... That’s where we’re putting our energy going forward.”
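For readers unfamiliar with how this kind of detection works, the sketch below is a minimal, purely illustrative example of matching uploaded files against a list of known hashes. It is not Apple’s NeuralHash or Microsoft’s PhotoDNA, which use perceptual hashing so that resized or re-encoded copies of an image still match; this toy version uses exact SHA-256 digests and a hypothetical hash list and directory name for simplicity.

```python
import hashlib
from pathlib import Path

# Hypothetical set of known-bad image digests (hex strings). In production systems,
# such lists are perceptual-hash databases maintained by child-safety organizations.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}


def sha256_of_file(path: Path) -> str:
    """Compute the SHA-256 digest of a file's bytes, reading in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def is_flagged(path: Path) -> bool:
    """Return True if the file's digest appears in the known-hash list."""
    return sha256_of_file(path) in KNOWN_HASHES


if __name__ == "__main__":
    # "uploads" is a placeholder directory for this example.
    for image in Path("uploads").glob("*.jpg"):
        if is_flagged(image):
            print(f"match: {image}")
```

Real systems differ mainly in the hashing step: a perceptual hash maps visually similar images to similar values, so matching tolerates minor edits, whereas an exact cryptographic hash like the one above only catches byte-identical copies.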

Now, West Virginia alleges Apple “knowingly and intentionally designed its products with deliberate indifference to the highly preventable harms.” McCuskey believes other states could take legal action against Apple as well, as he told reporters during a press conference that he thinks they’ll “see the leadership that this office has taken” and “join us in this fight.”

The lawsuit claims Apple made just 267 CSAM reports to the National Center for Missing & Exploited Children, compared with more than 1.47 million reports from Google and more than 30.6 million from Meta. It also cites an internal message in which Apple’s fraud chief Eric Friedman allegedly describes iCloud as the “greatest platform for distributing child porn.”

Many online platforms, including Google, Reddit, Snap, and Meta, use tools like Microsoft’s PhotoDNA or Google’s Content Safety API to detect, remove, and report CSAM in the photos and videos sent through their systems. Apple doesn’t use these tools, but it has rolled out some features focused on child safety, including parental controls that require kids to get permission to text new numbers, as well as a tool that automatically blurs nude images for minors in iMessage and other apps. But McCuskey argues that these safeguards aren’t enough to protect children.

“Apple has knowingly designed a set of tools that dramatically reduces friction for possessing, collecting, safeguarding, and spreading CSAM, all the while engineering an encryption shield that makes it much more likely for bad actors to use Apple to protect their illicit activities,” the lawsuit claims.