Minnesota passes the nation’s first ban on ‘nudification’ apps

The apps are one of the major ways nonconsensual AI deepfakes can be made without any technical expertise — including by kids.
This article was originally published by The 19th.
The Minnesota Senate on Wednesday passed the country’s first ban on “nudification” apps 65-0, addressing one of the main sources of nonconsensual deepfakes. The bill was passed by the state House last week and now just needs the governor’s signature to become law.
Advocates are optimistic that Gov. Tim Walz, a Democrat, will sign the legislation soon.
This bill was the first attempt in the country to ban websites or apps that promote digital undressing, where photographs of fully clothed people can be uploaded and manipulated with generative AI to appear nude.
These services power nonconsensual intimate imagery and don’t require any technical expertise to use. Google and Apple ban nudification apps from their respective app stores, but research by the Tech Transparency Project showed they remain easily accessible. Investigations from multiple news organizations have found that Meta continues to allow these apps to advertise on its social media platforms Facebook and Instagram.
This accessibility means the tools are easy for kids to use; the independent media organization Indicator has tracked 23 cases of deepfake abuse targeting school communities in the United States since 2023.
Federal attempts to create a civil right of action for survivors of nonconsensual deepfakes have stalled in Congress. The DEFIANCE Act has yet to make it to the House floor, though it has been passed by the Senate twice. Last year’s Take It Down Act made it a federal crime to disseminate nonconsensual intimate images, regardless of provenance, but does not allow survivors to sue for damages.
Minnesota House File 1606 would allow survivors to sue the owners of nudification apps for damages and empower the state attorney general to collect fines of $500,000 per violation.
The number of nonconsensual deepfakes has risen over the past few years. A mass episode of digital sexual violence kicked off in December when the social media platform X enabled its integrated chatbot Grok to generate images for free. Reporting from The New York Times and the Center for Countering Digital Hate estimated that Grok created and posted more than 1.8 million sexualized images of women over nine days.
X said it took steps to restrict the creation of nonconsensual deepfakes, but users have been consistently able to bypass guardrails.
Most generative AI products have protections to prevent the creation of sexualized imagery. Lawsuits allege that xAI — the creator of Grok, which marketed the chatbot as “willing to answer spicy questions that are rejected by most other AI systems” — does not take industry-standard precautions.
The Grok incident pushed nudifiers into the mainstream, but a whole economy of apps and websites monetizes the creation of nonconsensual deepfakes.
Deepfakes used to be time- and labor-intensive to create, but now they can be generated with the click of a button. That access is why more and more kids are becoming perpetrators of this kind of abuse, often victimizing their peers.
RAINN, the national nonprofit that runs the National Sexual Assault Hotline, is one of the main forces behind Minnesota’s bill because tech-facilitated abuse is on the rise. Sandi Johnson, senior legislative policy counsel for the group, said there has been an increase in the number of children calling about digital violence over the past five years.
The Centers for Disease Control and Prevention measured the occurrence of tech-facilitated sexual abuse for the first time in its 2023-2024 National Intimate Partner and Sexual Violence Survey. The term is broad, encompassing everything from being the target of nonconsensual deepfakes to being sent unsolicited explicit images. The survey found that in the 12 months prior, 1 in 10 women reported experiencing this kind of abuse; 1 in 3 women said the same when the question applied to their lifetimes.
Molly Kelley is one of those women. Two years ago, she found out a close family friend used a site known for nudification to make nonconsensual deepfakes of her and other women in his life. Around 80 women in Minnesota were impacted by the same perpetrator, and it kicked off Kelley’s quest for justice.
“I've dedicated the past two years of my life to finding a solution to mitigate the harm when it's actually caused, which is at creation,” Kelley told The 19th. “These images don't exist without a third-party involvement and some sort of machine learning model.”
The deepfakes were only stored on the man’s computer, so, Kelley said, no laws banning dissemination, like the Take It Down Act, would apply. (Kelley scoured porn sites looking to see if the images had been shared.) She said that there was no indication of ill intent and that the photos weren’t made consensually, thus ruling out the state’s “revenge porn” law. None of the women was a minor, so possessing the images wasn’t a crime.
Realizing no law would allow her to sue for restitution, Kelley said she began calling everyone she could think of. She eventually connected with Sen. Erin Maye Quade, a member of the Democratic-Farmer-Labor Party, who introduced HF 1606.
Before that, the only people Kelley had been able to connect with were school administrators, who have often struggled with how to handle kids victimizing each other with deepfakes. She turned her attention to what she sees as the source of the problem: the technology that created the deepfakes.
“This has taken every spare moment I have,” Kelley said. She has educated lawmakers, given testimony and advocated endlessly for the past two years, all while juggling two kids, a full-time job and law school. She said advocacy is like another full-time job.
Tech legislation can be tricky, but Kelley is confident that HF 1606 will withstand any court challenges if signed into law.
Johnson said RAINN consulted with numerous technology companies to ensure the law would not introduce unintended consequences for general products. This version of the bill includes an exemption from liability for companies where the “technical skill of a user” is required to edit an image, such as with standard tools like Photoshop.
The Trump administration has advocated for federal preemption of state AI laws; if that policy is solidified, this bill could be voided.
In the meantime, Kelley is waiting for the manipulated images of her to surface. After she found out about the deepfakes, she wiped all of her social media.
“Deep down, this is a manipulation and a control issue of women,” she said.