Georgia wants to ban AI deepfakes in political campaigns

March 20, 2024

When grappling with legislation, sometimes it helps to put the problem right in front of you.

In Georgia, that sounds like state Sen. Colton Moore. But it only sounds like Colton Moore.

Rep. Todd Jones, a Republican who chairs the Georgia House Committee on Technology and Infrastructure Innovation, has proposed legislation that would ban the use of artificial intelligence deepfakes in political communications. To illustrate this point, Jones presented a deepfake video to the Judiciary Committee using an AI image and audio of Moore and Mallory Staples, a former Republican congressional candidate who now heads a far-right activist organization, the Georgia Freedom Caucus.


The video uses an AI tool to mimic the voices of Moore and Staples falsely endorsing the bill's passage. A running disclaimer at the bottom of the video quotes the text of the bill.

Moore and Staples oppose the legislation.

The AI impersonation of Moore says: "I would like to ask the committee: Is using my biometric data, such as my voice and likeness, to create media that supports a policy I clearly disagree with someone else's First Amendment right?"

The video continues: "The overwhelming majority of Georgians believe that using my personal characteristics against my will is fraud, but our laws currently do not reflect that. If AI can be used to make Colton Moore speak in favor of a popular piece of legislation, it can be used to make any of you say things you've never said before."

Rep. Brad Thomas, the bill's Republican co-sponsor and a co-creator of the video, said he and his colleagues used widely available tools to create it.

“The particular one we used cost about $50. With a $1,000 version, your own mother wouldn’t be able to tell the difference,” he said.

The pace of advancement of generative AI video tools is years ahead of the legislation needed to prevent misuse, Thomas said: "Cinematic-style video. Those individuals look absolutely real, and they are AI-generated."

The bill passed the committee on an 8-1 vote.

Moore is unpopular in Georgia legislative circles. His Senate colleagues threw him out of the Republican caucus in September, accusing him of making false statements about other conservatives while fruitlessly advocating for a special session to remove Fulton County District Attorney Fani Willis from office.

Last week, Moore was permanently barred from the Georgia House chamber after verbally attacking the late speaker during a memorial service held on the House floor.

Through the Georgia Senate press office, Moore declined to comment.

In social media posts, Moore has opposed the bill, calling it an attack on "memes" used in political discourse and arguing that satire is protected speech.

Staples, in newsletters to her supporters, cited Douglass Mackey's federal conviction last year as an example of potential harm. Mackey, also known as alt-right influencer "Ricky Vaughn," sent mass text messages in November 2016 encouraging Black recipients to "vote by text" instead of casting an actual vote; the messages falsely claimed to have been paid for by the Clinton campaign.

Federal judges rejected Mackey's First Amendment arguments, finding that the communications amounted to fraud, which is not constitutionally protected. Mackey was sentenced in October to seven months in prison.

House Bill 986 creates the crimes of fraudulent election interference and solicitation of fraudulent election interference, with penalties of two to five years in prison and fines of up to $50,000.

Under the bill, someone who publishes, broadcasts, streams or uploads materially deceptive media within 90 days of an election – defined as media portraying a real individual's speech or conduct that did not actually occur and that would appear authentic to a reasonable person – would be guilty of a crime, so long as the media in question significantly influences a candidate's or referendum's chances of success or confuses the administration of that election. The bill would also criminalize the use of deepfakes to sow doubt about the outcome of an election.

Deepfakes figured in the 2024 election from the start, with an AI-generated robocall imitating Joe Biden telling voters in New Hampshire not to vote. After the call, the Federal Communications Commission announced a ban on robocalls that use AI audio. But the Federal Election Commission has yet to implement rules for political ads that use AI, something watchdog groups have been calling for for months. Regulations are lagging behind the reality of AI's potential to mislead voters.

In the absence of federal election rules for AI content, states have stepped in, introducing and, in several cases, passing bills that typically require labels on political ads that use AI in some way. Under most of the bills introduced in states, AI-generated content in political ads that lacks such a label would be illegal.

Experts say AI audio in particular has the potential to mislead voters because a listener loses the contextual cues that could alert them to a fake, cues a video might provide. Audio deepfakes of prominent figures, such as Trump and Biden, are easy and cheap to create using readily available apps. Even lesser-known people who often speak in public leave behind a large number of voice samples, such as speeches or media appearances, and those samples can be uploaded to train a deepfake clone of the person's voice.

Enforcing the Georgia law could prove challenging. Long before the rise of AI, lawmakers struggled to find ways to rein in anonymous flyers and robocalls spreading misinformation and fraud in the run-up to elections.

"I think that's why we gave concurrent jurisdiction to the attorney general's office," Thomas said. "One of the other things we've done is allow the [Georgia Bureau of Investigation] to investigate election issues. With the horsepower of these two organizations, we have the best chance of finding out who did it."

Lawmakers are only just beginning to understand the implications of AI. Thomas expects more legislation to come in the next few sessions.

“Fraud is fraud, and that is what this bill amounts to,” Thomas said. “That is not a First Amendment right for anyone.”

Rachel Leingang contributed to the reporting
