
Apple Sued Over Allegations of CSAM on iCloud


Apple is facing a lawsuit filed Thursday by West Virginia Attorney General JB McCuskey over allegations that iCloud is being used to store and distribute child sexual abuse material (CSAM) online. McCuskey alleges that Apple knew about this "for years" and "chose to do nothing about it."

The lawsuit includes screenshots of an alleged February 2020 iMessage conversation between Apple executives Eric Friedman and Herve Sibert in which they acknowledge the storage and distribution of CSAM on iCloud.

"In an iMessage conversation about whether Apple might be putting too much emphasis on privacy and not enough on trust and child safety, Friedman boasted that iCloud is 'the greatest platform for distributing child porn' and that Apple has 'chosen to not know in enough places where we really cannot say'," the lawsuit alleges.

"In the same conversation," it continues, "Friedman referred to a New York Times article about CSAM detection and revealed that he suspects Apple is underreporting the size of the CSAM issue it has on its products."

The lawsuit points to the number of reports of detected CSAM each company made to the National Center for Missing and Exploited Children in 2023: Apple filed 267, compared with 1.47 million from Google and 30.6 million from Meta.

The lawsuit alleges Apple failed to implement CSAM detection tools, including a proprietary scanning tool it had been developing. In 2021, Apple announced an initiative to scan images stored in iCloud for CSAM, but it had abandoned the effort by the following year.

The role of end-to-end encryption

The lawsuit also points to Advanced Data Protection, an iCloud security feature Apple introduced in December 2022 that enables end-to-end encryption of photos and videos stored on the platform. It alleges that end-to-end encryption is "a barrier to law enforcement, including the identification and prosecution of CSAM offenders and abusers."

"Preserving the privacy of child predators is absolutely inexcusable," McCuskey said in a statement Thursday. "Since Apple has so far refused to police themselves and do the morally right thing, I am filing this lawsuit to demand Apple follow the law, report these images and stop re-victimizing children by allowing these images to be stored and shared."

Apple told CNET that "safety and privacy" are at the center of its decisions, especially for children.
