TL;DR
- Apple’s and Google’s app stores host several apps that use AI to create fake nude images, a report claims.
- Google has now responded to the report.
- The company says Google Play “does not allow apps that contain sexual content” and that its “investigation and enforcement process continues.”
AI technology has advanced rapidly in the last few years. Sadly, the problem of AI-generated fake nude images is spreading just as quickly. Although Apple and Google claim to crack down on harmful apps that enable it, a recent report found that their app stores host a number of “nudify” apps. Google has now offered a response to the report.
As a quick refresher, previous reports noted that Apple and Google have policies that explicitly prohibit apps that promote exploitation or abuse. Despite this, searching for terms like “nudify” or “undress” on the App Store or Google Play turns up dozens of such exploitative apps. Both marketplaces reportedly also advertise these tools and surface them via search autocomplete. What’s more disturbing is that some carry an “E” rating for Everyone, so even children can download them.
For its part, Google tells Android Authority that it is investigating the matter and taking appropriate action. A company spokesperson says:
Google Play does not allow apps that contain sexual content. When violations of our policies are reported to us, we investigate and take appropriate action. Many of the apps referenced in this report have been suspended from Google Play for violating our policies. Our investigation and enforcement process continues.
It appears that Apple has also taken action. The Cupertino-based firm responded to the earlier report, telling Bloomberg that it has removed 15 apps. For context, the report found 20 nudify apps on the Play Store and 18 on the App Store.
