
Minnesota Moves to Block AI “Deepfake” Pornography Apps
A Minnesota (US) bill seeks to ban apps and websites that use AI to create sexually explicit media of people without their consent. Though laws against “deepfake” and revenge pornography already exist across the US, this bill is the first of its kind to target the providers directly.
Originally introduced in February 2025, the bipartisan bill is currently under review. If passed, it would require operators of so-called “nudification” sites to cease operations in the state or face civil penalties of up to $500,000 “for each unlawful access, download, or use.”
“Nudification” sites and apps let users upload photographs – taken from social media or in person – and use generative AI to produce realistic, sexually explicit images of the person in minutes.
The legislation gained momentum after Molly Kelley, a Minnesota resident, testified about her experience as a victim. An acquaintance had generated and distributed explicit images of her, using photos taken from her social media accounts.
“My initial shock turned to horror when I learned that the same person targeted about 80, 85 other women, most of whom live in Minnesota, some of whom I know personally, and all of them had connections in some way to the offender,” Kelley said.
States including California, Illinois, and New York have laws in place to stop the dissemination of AI deepfakes. But authorities say the widespread availability and accessibility of these platforms make enforcement difficult.
Once the images are circulating online, they’re “nearly impossible to remove,” according to Sandi Johnson, a senior policy counsel at the victim advocacy group RAINN (Rape, Abuse & Incest National Network).
US laws addressing deepfake content vary widely, with many states and federal proposals targeting not only dissemination but also creation, distribution, and hosting of the material. Senator Erin Maye Quade, the lead author of the Minnesota bill, argues that the harm begins long before the images are shared.
“It’s not just the dissemination that’s harmful to victims,” Maye Quade said. “It’s the fact that these images exist at all.”
The bill comes shortly after San Francisco filed a landmark lawsuit in August against 16 of the most popular sites of this kind. City Attorney David Chiu wrote in the suit: “Collectively, these sites have been visited over 200 million times just in the first six months of 2024.”