ST. PAUL, Minn. (AP) – Molly Kelley was stunned to discover in June that someone she knew had used widely available "nudification" technology to create highly realistic and sexually explicit videos and images of her, using family photos that were posted on social media.
"My initial shock turned to horror when I learned that the same person targeted about 80, 85 other women, most of whom live in Minnesota, some of whom I know personally, and all of them had connections in some way to the offender," Kelley said.
Backed by her testimony, Minnesota is considering a new strategy for combating deepfake pornography. A bill with bipartisan support would target companies that run websites and apps that let people upload a photo to be transformed into explicit images or videos.
States across the country and Congress are considering strategies for regulating artificial intelligence. Most states have banned the dissemination of sexually explicit deepfakes or revenge porn, whether or not they were produced with AI. The idea behind the Minnesota legislation is to prevent the material from ever being created, before it spreads online.
Experts on AI law caution the proposal might be unconstitutional on free speech grounds.
Why advocates say the bill is needed
The lead author, Democratic Sen. Erin Maye Quade, said additional restrictions are necessary because AI technology has advanced so rapidly. Her bill would require the operators of "nudification" sites and apps to turn them off for people in Minnesota or face civil penalties of up to $500,000 "for each unlawful access, download, or use." Developers would need to figure out how to disable the function for Minnesota users.
It's not just the dissemination that's harmful to victims, she said. It's the fact that these images exist at all.
Kelley told reporters last month that anyone can use the technology to create "hyper-realistic nude images or pornographic video" in minutes.
Most law enforcement attention so far has been focused on distribution and possession.
Congress, states and cities are also trying other tactics
San Francisco in August filed a lawsuit against several widely visited "nudification" websites, alleging they broke state laws against fraudulent business practices, nonconsensual pornography and the sexual abuse of children. That case remains pending.
The U.S. Senate last month unanimously approved a bill by Democrat Amy Klobuchar of Minnesota and Republican Ted Cruz of Texas to make it a federal crime to publish nonconsensual sexual imagery, including AI-generated deepfakes. Social media platforms would be required to remove such images within 48 hours of notice from a victim. Melania Trump on Monday used her first public appearance since becoming first lady again to urge passage by the Republican-controlled House, where the bill is pending.
The Kansas House last month approved a bill that expands the definition of illegal sexual exploitation of a child to include possession of images generated with AI if they're "indistinguishable from a real child, morphed from a real child's image or generated without any actual child involvement."
A bill introduced in the Florida Legislature would create a new felony for people who use technology such as AI to generate nude images and would criminalize possession of child sexual abuse images generated with it. Broadly similar bills have also been introduced in Illinois, Montana, New Jersey, New York, North Dakota, Oregon, Rhode Island, South Carolina and Texas, according to an Associated Press analysis.
Maye Quade said she'll be sharing her proposal with legislators in other states because few are aware the technology is so readily accessible.
"If we can't get Congress to act, then we can maybe get as many states as possible to take action," Maye Quade said.
Victims tell their stories
Sandi Johnson, senior legislative policy counsel for the victims' rights group RAINN (the Rape, Abuse and Incest National Network), said the Minnesota bill would hold websites accountable.
"Once the images are created, they can be posted anonymously, or rapidly and widely disseminated, and become nearly impossible to remove," she testified recently.
Megan Hurley also was horrified to learn someone had generated explicit images and video of her using a "nudification" site. She said she feels especially humiliated because she's a massage therapist, a profession that's already sexualized in some minds.
"It is far too easy for one person to use their phone or computer and create convincing, synthetic, intimate imagery of you, your family, and friends, your children, your grandchildren," Hurley said. "I do not understand why this technology exists and I find it abhorrent there are companies out there making money in this manner."
AI experts urge caution
However, two AI law experts, Wayne Unger of the Quinnipiac University School of Law and Riana Pfefferkorn of Stanford University's Institute for Human-Centered Artificial Intelligence, said the Minnesota bill is too broadly constructed to survive a court challenge.
Limiting the scope to images of real children might help the bill withstand a First Amendment challenge, since such images are generally not protected speech, Pfefferkorn said. But she said it could still conflict with Section 230, the federal law that shields websites from liability for content their users generate.
"If Minnesota wants to go down this direction, they'll need to add a lot more clarity to the bill," Unger said. "And they'll have to narrow what they mean by nudify and nudification."
But Maye Quade said she thinks her legislation is on solid constitutional ground because it's regulating conduct, not speech.
"This cannot continue," she said. "These tech companies cannot keep unleashing this technology into the world with no consequences. It is harmful by its very nature."
___
Associated Press reporters Matt O'Brien, John Hanna and Kate Payne contributed to this story from Providence, Rhode Island; Wichita, Kansas; and Tallahassee, Florida, respectively.
___
This story has been corrected to show the spelling of Molly Kelley's last name is Kelley, not Kelly.
Steve Karnowski, The Associated Press