The Apple and Google Play app stores are hosting dozens of "nudify" apps that can take photos of people and use artificial intelligence to generate nude images of them, according to a report Tuesday from an industry watchdog.
A review of the two app stores conducted in January by Tech Transparency Project found 55 nudify apps on Google Play and 47 in the Apple App Store, according to the organization's report that was shared exclusively with CNBC.
After being contacted by TTP and CNBC last week, an Apple spokesperson said Monday that the company had removed 28 apps identified in the report. The iPhone maker said it also alerted the developers of other apps that they risk removal from the Apple App Store if guideline violations aren't addressed.
Two of the apps removed by Apple were restored to the store after their developers resubmitted new versions that addressed the guideline concerns, the spokesperson told CNBC.
"Both companies say they are dedicated to the safety and security of users, but they host a collection of apps that can turn an innocuous photo of a woman into an abusive, sexualized image," TTP wrote in its report about Apple and Google.
TTP, however, told CNBC on Monday that its own review of the Apple App Store found that only 24 of the apps had been removed by the tech company.
A Google spokesperson told CNBC that the company suspended several apps referenced in the report for violating its app store's policies, saying that it investigates when policy violations are reported. The company declined to say specifically how many apps it had removed, because its investigation into the apps identified by TTP was ongoing.
The report comes after Elon Musk's xAI faced backlash earlier this month when its Grok AI tool responded to user prompts asking it to generate sexualized photos of women and children.
The watchdog organization identified the apps by searching the two stores for terms like "nudify" and "undress," and tested them using AI-generated images of fully clothed women. The project tested two types of apps: those that used AI to render the images of the women without clothes, and "face swap" apps that superimposed the original women's faces onto images of nude women.