Amazon's AI services division filed 1.1 million reports of suspected online child exploitation with an advocacy group in 2025. But because those reports lacked essential information, law enforcement was unable to act on a single one. A new Senate inquiry aims to ensure that never happens again.
Sen. Chuck Grassley, an Iowa Republican who chairs the Senate Judiciary Committee, this week opened an inquiry into eight big tech companies over their handling of mandatory reporting of online child exploitation. It's the latest step in a growing movement questioning whether tech companies can be trusted to keep their youngest users safe while online.
Electronic service providers are required by law to report incidents of child sex exploitation to the CyberTipline run by the National Center for Missing and Exploited Children (NCMEC). In 2025, more than 17 million reports of suspected online child sex exploitation were filed. But many of those reports lack the information needed to prompt action in the real world.
"I'm alarmed by what I've read," Grassley said. "Based on information provided to my office, I am concerned that some companies have not provided NCMEC and law enforcement with sufficient data needed to protect kids and prosecute suspected predators."
Grassley sent requests for more information to several major tech companies: Meta, TikTok, Roblox, Snap, Amazon AI Services, xAI, Grindr and Discord. These eight companies account for 81% of all child exploitation reports submitted to NCMEC. Notably absent from the inquiry was Google, owner of YouTube.
A Meta spokesperson told CNET the company "works tirelessly" to protect kids from this "horrific crime," stating: "We're committed to constant improvement and appreciate feedback, which has already led us to make some improvements, as NCMEC has acknowledged. We will continue making refinements to improve our reporting process."
Grindr, Discord and Roblox made similar comments, saying they plan to work with the Senate and NCMEC on these issues. Grindr added that its dating service is only for adults 18 and older. The other tech companies did not immediately respond to requests for comment.
The Iowa Republican's inquiry follows reports from NCMEC in 2025 that tech companies were failing to provide essential location data in their reports and failing to disclose the presence of child sex abuse material in AI training data. This is especially concerning given previous incidents of AI being used to create nonconsensual intimate imagery, including child sex abuse material.
Child exploitation online is a growing issue. In 2025, Meta, which owns the popular platforms Facebook, Instagram and WhatsApp, alone filed nearly 11 million reports, 1.2 million of which involved suspected child trafficking. NCMEC said in 2025 that Meta and xAI had improved their reporting, but that it still fell short.
"Many ESPs regularly tout the number of reports they submit to the CyberTipline, but fail to disclose that millions of reports lack basic information," NCMEC wrote to Grassley in 2025. "This leaves children unprotected online, subjects survivors to revictimization, enables sexual offenders to remain freely online and wastes valuable and limited law enforcement resources."