Google and Apple Hosted Dozens of 'Nudify' Apps Despite Their Policies
A research report revealed that dozens of AI-powered 'nudify' and face-swapping applications, which violate platform policies, are present on the Apple App Store and Google Play Store.
Deepfake Content Crisis in App Stores
Despite their user-protection policies, the app stores of tech giants Google and Apple have hosted AI applications that generate non-consensual nude images. A new study published by the Tech Transparency Project reveals the scale of the problem.
The Study's Striking Findings
According to the report, 55 apps labeled as 'nudify' were detected on the Google Play Store, and 47 were found on the Apple App Store. Thirty-eight of these apps were common to both stores. Researchers found dozens of results by searching the stores with keywords like 'nudify' and 'undress'.
These applications are AI tools that generate nude images or videos from text prompts, or that place a person's face onto another body ('face-swapping'). The app DreamFace, for example, reportedly does not filter explicit text prompts and generates nude images of women. The app has reportedly earned $1 million in revenue and remains available on the Google Play Store.
Platform Policies and Revenue Sharing
Both the Google Play Store and Apple App Store enforce policies banning apps that depict sexual nudity. Yet the report found that the stores serve as distribution channels despite clear violations of these policies. Even more striking, the platforms take a cut of up to 30% of the in-app purchase revenue these applications generate.
Similarly, the EU's investigation into platform X reflects growing global concern over sexually explicit deepfakes generated by AI. Furthermore, reports on Grok AI's child safety failures show how critical the issue has become.
Some Apps Removed
After the report was shared with Google and Apple, the companies reportedly took action: Google removed 31 apps and Apple removed 25 from their stores. However, some face-swapping apps, such as RemakeFace, were still available on both stores when the report was published. Such apps can easily be used to swap the faces of people the user knows onto nude bodies without their consent.
Tech Giants' Responsibility and Future Steps
The incident shows that app stores are failing to prevent the spread of AI-powered deepfake apps that can 'undress' people without their consent. The platforms bear a responsibility to protect users. Experts stress that the companies need to tighten their guidelines and shift from reactive to proactive monitoring in order to detect such apps.
This development is also being weighed in the context of the growing legal liabilities facing tech giants. Google and Apple had not responded to requests for comment at the time of publication.