Minnesota’s House passed a bill aimed at shutting off access to AI “nudification” tools that can turn ordinary photos into realistic non-consensual nude or sexual images, including child sexual abuse material, with one Republican lawmaker casting the lone vote against it.
HF1606, titled “Nudification technology access prohibited,” passed the Minnesota House by a 132-1 vote and now moves to the Senate. The bill was authored by Rep. Jess Hanson and targets websites, apps, software, programs, or services that allow users to “nudify” images or videos.
The bill defines “nudify” as altering or generating an image or video to depict an intimate part not shown in the original image or video of an identifiable person, where the result is realistic enough that a reasonable person would believe the intimate part belongs to that person. It also bans operators from nudifying an image or video on behalf of a user.
HF1606 is not written as a new criminal statute but creates civil liability and enforcement tools against people or entities that own, control, advertise, promote, or provide access to nudification technology. The bill also includes an exemption for tools that require “substantial” human technological or artistic skill and judgment, language designed to distinguish automated “push-button” nudifiers from more controlled creative software.
Victims could sue in district court for compensatory damages, including mental anguish, in an amount up to three times actual damages. The bill also allows punitive damages, injunctive relief, attorney fees, costs, and other court-approved relief.
The Minnesota attorney general would be allowed to enforce the law, with civil penalties of up to $500,000 for each unlawful access, download, or use. Penalty proceeds would be routed into grants for organizations providing direct services and advocacy for victims of sexual assault, general crime, domestic violence, and child abuse.
The measure would take effect August 1, 2026, for causes of action accruing on or after that date.
🚨 Minnesota House passes bill banning use of AI to generate nude images of children. One Republican voted no. pic.twitter.com/VX8fTtqYMg

— Democrats Deliver (@DemzDeliver) April 28, 2026
Who voted no?
The lone dissent came from Rep. Drew Roach, a Republican from Farmington. Roach said material disseminated through nudification tools is “disgusting” and “vile,” and that victims deserve accountability and justice, but argued the bill does not address the root cause because someone with technical skill could still create the same material without using a covered tool.
“What we’re going to do here is we’re going to attack a software, a manufacturer and instead, shifting our focus on that instead of the perpetrators of these crimes,” Roach said. “If we want to prevent this from happening in the future, we should go after those perpetrators with the full force of the law.”
Hanson countered that the bill targets content creation and that existing laws are “not strong enough to catch a lot of them.”
The debate comes as lawmakers across the US test different strategies for AI-generated sexual imagery. AP reported last year that most state efforts had focused on banning dissemination of sexually explicit deepfakes or revenge porn, while Minnesota’s approach seeks to stop the material before it spreads by targeting operators of nudification sites and apps.
AI law experts have raised constitutional concerns over free speech and federal platform immunity, while supporters argue the bill regulates conduct rather than protected expression.
A proposed amendment requiring age verification for websites where at least 33.3% of content is harmful to minors failed along party lines. Hanson opposed adding it, arguing it would deflect from the core bill’s focus on non-consensual AI-generated intimate imagery.