HeadlinesBriefing.com

Minnesota Cracks Down on AI Fake Nudes with $500K Fines

Ars Technica

Minnesota lawmakers today enacted a ban on AI‑generated nudity, targeting services that let users produce fake nudes of real people. The new law bars free or paid apps from offering such tools, with violations punishable by fines of up to $500,000. Public safety officials say the move protects women, children, and other citizens from a growing digital threat.

The legislation follows a September CNBC exposé that uncovered a Minnesota man who had fabricated nude images of dozens of women in his friend network. Although the perpetrator apologized, no evidence surfaced that he shared the images online, which left existing revenge‑porn statutes, built around distribution, inapplicable. Victims, led by Molly Kelley, pushed lawmakers to act before the technology could spread further.

State Sen. Erin Maye Quade, the bill's sponsor, credited the survivors with driving the legislation, noting their courage in sharing personal stories with lawmakers and journalists. She warned that while the state can fine domestic app developers, enforcing penalties against overseas operators like DeepSwap, headquartered in Hong Kong and Dublin, will test the limits of Minnesota law enforcement.

The law's strict penalties aim to stop creators from exploiting machine‑learning models to produce fabricated explicit imagery, a harm that traditional laws miss. Critics argue that a single state's reach is limited and suggest that federal action could provide a more comprehensive shield. For now, Minnesota sets a precedent that other states may follow.