HeadlinesBriefing.com

Women sue AI ModelForge over nonconsensual deepfake porn

Ars Technica
An Arizona woman identified only as MG has sued three Phoenix men who run the AI ModelForge platform, alleging they turned her Instagram photos into nonconsensual deepfake porn. The defendants allegedly scraped images of women with under 50,000 followers, fed them into CreatorCore, and sold the resulting nude videos on subscription sites. MG discovered the fakes after a follower sent her a link to reels that mimicked her look.

The complaint says the trio charged $24.95 a month on Whop for courses that taught other men how to replicate the process, providing “Blueprints” for image scraping and clothing removal. Plaintiffs claim the scheme generated more than $50,000 in a single month and amassed millions of views, creating a market of AI copies that prey on ordinary social‑media users.

The federal Take It Down Act, enacted in 2025, criminalizes nonconsensual AI porn, but its platform takedown requirements don't take effect until May 2026, leaving victims with limited recourse in the meantime. Arizona legislators have introduced bills demanding automated detection, while platforms like Instagram and TikTok say the offending accounts are under review. The lawsuit spotlights a growing profit model that weaponizes AI against everyday women.