Opinion

Get ready for AI-generated political chaos

By Matthew Gagnon

All things considered, it was relatively minor, at least as far as political shenanigans in the Granite State go.

On the eve of the New Hampshire primary, some Democratic voters began receiving a strange robocall that sounded like it came from President Joe Biden. The call directed voters not to vote in the Democratic primary and to instead “save your vote for the November election.”

In actuality, the call was created with artificial intelligence voice-cloning technology that mimicked the president’s voice quite realistically.

Whoever did it — and it could have been any number of bad actors, from random pranksters, to New Hampshire-based conservative activists, to Biden’s only competition in the primary, Rep. Dean Phillips — was trying to sow chaos in the already chaotic New Hampshire Democratic primary, depressing turnout. 

It was devilishly clever, too, because the messaging played off the very real and ongoing dispute among the president, the Democratic National Committee and the state of New Hampshire over the primary schedule. Biden’s name was not on the ballot in New Hampshire, and the state is being actively punished by the DNC for refusing to go along with the national party’s schedule change.

Given all this, a simple AI-created robocall of the president talking down the primary and encouraging people not to participate seemed plausible, at the very least. 

The technology involved in creating a message like this has come a long way in a very short amount of time. Today, consumers can download any one of dozens of phone apps that clone famous voices, including those of politicians, and that also let users map and clone their own. While not perfect, these AI-driven voices are getting closer and closer to being indistinguishable from the real thing.

I host a morning radio program in South Portland on Newsradio WGAN, and a few weeks ago I experimented with voice cloning to create a half dozen “endorsements” of my show by famous people. I created messages from Biden, Donald Trump, Taylor Swift, Gordon Ramsay and Snoop Dogg, all of which said nice things about my show and encouraged people to listen. Of course, I didn’t use these as “real” endorsements; I disclosed to the audience what I was doing and explained that the experiment was meant to highlight the potential for chaos in our future.

The scariest thing about my little experiment was how easy it was to make it sound natural. The first time you create one of these voices, it can sound robotic and stilted, mostly because people write differently than they talk. I found that simple tricks that make language sound more conversational, like dropping the letter g at the end of a word or writing a sentence with ellipses rather than commas, went a long way toward making it sound more natural.

Combine that with keen observation of the individual you are trying to mimic, tailoring the language to that person’s spoken phrasing, and the result gets even closer to perfect. When I created Snoop’s “endorsement,” I was even able to build in deliberate misspellings that accurately mimicked his gentle Southern California drawl.
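For the technically curious, those tweaks amount to nothing more exotic than a little text cleanup before the script is handed to a voice-cloning app. Here is a rough, purely illustrative sketch in Python; the function name and example sentence are invented for this column, and the apps themselves handle all of the actual audio work:

```python
import re

def conversationalize(text: str) -> str:
    """Roughly apply the tweaks described above so a cloned
    voice reads the script a little less stiffly."""
    # Drop the trailing g on words ending in -ing ("going" -> "goin'").
    text = re.sub(r"\b(\w+in)g\b", r"\1'", text)
    # Swap commas for ellipses to force a more spoken-sounding pause.
    text = text.replace(",", "...")
    return text

if __name__ == "__main__":
    script = "I'm telling you, this show is going to be something special."
    print(conversationalize(script))
    # Prints: I'm tellin' you... this show is goin' to be somethin' special.
```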

The New Hampshire call tried to be clever in the same way, though it did so poorly. Observing that Biden likes to use goofy words like “malarkey,” its creators had “Biden” say it as the message instructed voters to stay home. The mistake, of course, is that Biden says things like that extemporaneously during unscripted interactions, not in pre-written campaign messages. Ironically, the attempt to sound more natural is what gave the call away as obviously fake.

Still, this relatively minor example of campaign hijinks is likely a harbinger of a coming explosion of this type of tactic.

Think, for a moment, about all the calls you get about extended warranties on your car or Medicare scams. No matter how many times you opt out of the messages, report them as spam or complain about them, they keep coming. Clearly, the government is unable to do much of anything to stop them.

Politics, as an industry, has higher stakes than selling you garbage extended warranties. Power and control of highly influential offices are at stake each and every election cycle, creating a powerful incentive for bad behavior, provided the cost is low, which it clearly is.

What happens when political activists, campaigns and parties realize this, and start to weaponize AI not only for voice cloning but for image and video creation, leaving the entire electorate confused about who is really saying what?

We’ll find out, because it is about to happen.

Gagnon of Yarmouth is the chief executive officer of the Maine Policy Institute, a free market policy think tank based in Portland. A Hampden native, he previously served as a senior strategist for the Republican Governors Association in Washington, D.C.
