ST. PAUL — These days the internet is rife with “deepfakes,” artificial intelligence-generated video and audio, often depicting well-known people doing and saying things they never actually did.
The technology has been around for a while, but in the past year it has proliferated on a scale never seen before. While many of these fakes are seemingly harmless, like ex-presidents trading vulgar barbs on Joe Rogan’s podcast, there have also been fake sexual videos of people who never consented to their images being used that way. Policymakers are also concerned that fake videos could be used to spread false information that could influence elections.
States such as California and Texas have already introduced new laws regulating the use of AI-generated fakes, and some Minnesota lawmakers are hoping to do the same.
A bill in the Minnesota Legislature would make creating fake pornography of people without their consent a criminal offense and establish penalties for distributing altered images intended to influence an election.
“These technologies make it possible to create highly realistic, yet completely false depictions of people doing and saying things that never actually happened,” Rep. Zack Stephenson, DFL-Coon Rapids, told the House Elections Committee earlier in February. “This has serious implications for privacy, free speech and the integrity of our elections.”
After finishing his remarks, Stephenson revealed to committee members that his words had in fact been generated by the AI program ChatGPT, a chatbot that has garnered significant attention in recent months for its ability to generate detailed text responses to prompts ranging from search engine queries to requests to write a persuasive essay.
The program can write poetry and rap lyrics, and The Associated Press earlier this month asked it to write the State of the Union address as it might be delivered by figures like Yogi Berra or Elvis Presley, ahead of the actual speech from President Joe Biden.
“With the increasing sophistication of these technologies, it's becoming easier to create convincing fake news or propaganda that is designed to manipulate public opinion,” Stephenson continued, reading from the AI-generated speech. “This could potentially have a significant impact on the outcome of elections, undermining the integrity of our democratic process.”
Stephenson, speaking for himself rather than reading AI-generated text, said he was not aware of a specific instance in which a deepfake had been used to influence an election in Minnesota, but that there have already been cases of nonconsensual deepfake pornography.
Minnesota already has a statute prohibiting revenge porn, or the nonconsensual distribution of private sexual images, but Stephenson said the current law would likely not apply to pornographic deepfakes. The deepfake bill is modeled after the revenge porn statute, he said.
Under the bill, it would be a gross misdemeanor to distribute, without consent, altered images depicting a person as naked or engaging in a sexual act when the person was not actually naked or engaged in sex. People depicted in sexual deepfakes could also sue the creators for damages and to have the images taken down from the internet. The bill would also establish a felony penalty for knowingly posting a pornographic deepfake to a website, disseminating it for profit, using it to harass a person, or for repeat offenses.
On the elections side, it would be a crime to knowingly distribute an altered video to injure a candidate or influence the outcome of an election within 60 days of Election Day. A first offense would be a misdemeanor, a gross misdemeanor if done with the intent to cause violence, and a felony if it is a second offense within five years.
It’s not the first time a bill addressing deepfake pornography has been introduced in the Legislature. Stephenson credited Sen. Eric Lucero, R-St. Michael, with introducing such a bill while a member of the House.
The House Elections Committee referred the bill to the judiciary committee on a voice vote. A companion bill in the Senate has not yet had a hearing but has bipartisan support, including from Lucero and Sen. Erin Maye Quade, DFL-Apple Valley.
Follow Alex Derosier on Twitter @xanderosier or email aderosier@forumcomm.com.