HeadlinesBriefing.com

App Store Hosts Nonconsensual AI Undressing Apps, Report Claims

9to5Mac

A new report from the Tech Transparency Project reveals that the App Store and Google Play Store are rife with apps that generate nonconsensual AI imagery. These apps, often marketed with terms like "nudify," allow users to create sexualized images of real people without their consent. The report highlights both how easily these apps can be found and the revenue they generate.

The issue has grown as generative AI tools have made creating such images far easier. According to the report, these apps have been downloaded over 705 million times and have generated $117 million in revenue. The findings suggest that Apple and Google are not effectively policing their platforms: both stores have policies prohibiting sexually suggestive content, yet these apps persist.

The TTP found that these apps fall into two categories: those that generate images from user prompts and "face swap" apps that graft a person's likeness onto existing imagery. The report also notes that simple searches for terms like "nudify" or "undress" surface these apps in both stores. Because Apple and Google take a cut of in-app revenue, both companies profit directly from them.

Scrutiny of Apple and Google is now likely to intensify. The report's findings will probably fuel calls for stricter app review processes and stronger enforcement of the policies both companies already have on the books. The situation underscores the ongoing challenge of regulating AI-generated content on major app platforms.