Despite strict policies against sexual and pornographic content, both Apple and Google are reportedly hosting and actively promoting "nudify" and deepfake apps. The Tech Transparency Project (TTP) found that searching for terms like "undress" or "nudify" not only triggered search ads for similar tools but effectively steered users towards them. Many of these apps carried an "E" (suitable for everyone) age rating, making them easily accessible to minors. TTP identified 18 such apps on Apple's App Store and 20 on Google Play.
These apps have been downloaded approximately 483 million times and have generated an estimated $122 million in revenue, which the platforms benefit from through app store fees. The apps let users "nudify" images of real people or swap faces into pornographic videos. While some are marketed with explicit sexual depictions, others appear harmless but contain deepfake tools used to create erotic content of real individuals.
The platforms are not just failing to properly review and approve these apps while profiting from them, TTP director Katie Paul told Bloomberg; they are "actually directing users to the apps themselves."
Following the report, Apple said it had removed 15 of the identified apps, and Google said it had suspended several for policy violations. Google stressed that its investigation into these violations is ongoing, reiterating that sexual content is strictly prohibited. Governments are moving to close these loopholes: the UK has called for a complete ban on AI deepfake apps targeting children.
Meanwhile, the US federal government and the state of California are implementing laws against explicit deepfakes, including recent legal action against Elon Musk's X (formerly Twitter).
