Exposing the Dark Side of App Stores: How Apple and Google are Failing to Protect Users
A recent investigation by the Tech Transparency Project (TTP) has uncovered a disturbing trend in the world of mobile apps: Apple and Google are not only hosting “nudify” apps on their platforms but are also actively promoting them through their search and advertising systems. These apps, which use AI to digitally strip clothing from photos of real people, can also generate pornographic videos or create sexually explicit chatbots using someone’s likeness.
The TTP found that 31 of the apps it identified were rated suitable for minors, raising serious concerns about the safety and well-being of young users. The investigation also revealed that both Apple and Google are running paid ads for these nudify apps, some of the ads openly pornographic. This is particularly troubling given that Apple controls all advertising in its App Store and has a stated policy against ads promoting adult content.
How Nudify Apps are Slipping Through the Cracks
The TTP ran searches using terms like “nudify,” “undress,” “deepfake,” and “AI NSFW” on both app stores. About 40% of the top 10 results for each term were apps that could render women nude or scantily clad. Autocomplete on both platforms also steered users toward these apps, with suggestions like “image to video ai nsfw” surfacing more nudify apps in the top results.
The investigation also found that the apps identified across both stores have been downloaded 483 million times and earned over $122 million in lifetime revenue. Apple and Google take a cut of that through paid subscriptions and in-app purchases, which may explain why enforcement has been lax. After TTP and Bloomberg flagged these apps, Apple removed 15 of them, and Google suspended several others, but both companies declined to explain how these apps had passed review or why age ratings allowed minors to download them.
The Consequences of Inaction
The UK government has begun proposing and enacting laws against explicit deepfakes, and the US recently recorded its first criminal conviction under one such law. Pressure on Apple and Google to act more decisively is only likely to grow. Apple’s own enforcement record is already under scrutiny, with a letter obtained by NBC News revealing that Apple privately threatened to pull Grok from the App Store in January over sexualized deepfakes.
As reports like this one continue to pile up, both companies are running out of room to look the other way. The fact that these apps have been downloaded hundreds of millions of times and have earned millions of dollars in revenue raises serious questions about the effectiveness of Apple and Google’s content moderation policies. It’s time for these tech giants to take responsibility for the content they host and to prioritize the safety and well-being of their users.