AI deepfakes fool voters and politicians ahead of 2024 US elections — ‘I thought it was real’

The U.S. appears unprepared for the onslaught of AI-generated imitations despite years of warnings from think tanks.

Citizens in New Hampshire received an unusual political request over the weekend of Jan. 20–21. Robo-calls featuring what sounded to many like United States President Joe Biden’s voice told them not to vote in the Jan. 23 primary.

The automated messages were apparently generated by an artificial intelligence (AI) deepfake tool, seemingly with the purpose of meddling in the 2024 presidential election.

NH voters are getting robocalls from Biden telling them not to vote tomorrow.

Except it’s not Biden. It’s a deepfake of his voice.

This is what happens when AI’s power goes unchecked.

If we don’t regulate it, our democracy is doomed. pic.twitter.com/8wlrT63Mfr

— Public Citizen (@Public_Citizen) January 22, 2024

Per audio recorded by NBC, residents were told to stay home during the primary:

“Voting this Tuesday only enables the Republicans in their quest to elect Donald Trump again. Your vote makes a difference in November, not this Tuesday.”

The state’s attorney general’s office issued a statement denouncing the calls as misinformation, adding that “New Hampshire voters should disregard the content of this message entirely.” Meanwhile, a spokesperson for former President Donald Trump denied any involvement from the GOP candidate or his campaign.

Related: Trump ‘will never allow’ CBDC, gives ‘full credit’ to Vivek Ramaswamy

Investigators have not yet identified the source of the robocalls, though the inquiry is ongoing.

In related news, a second deepfake audio scandal unfolded over the weekend: AI-generated audio imitating Manhattan Democratic Party leader Keith Wright emerged on Jan. 21, featuring an imitation of Wright’s voice trash-talking fellow Democratic Assembly member Inez Dickens.

According to a report from Politico, the audio was dismissed as fake by some, but at least one political insider was momentarily convinced it was real.

Manhattan Democrat and former City Council Speaker Melissa Mark-Viverito told Politico that, at first, she thought the fake was credible:

“I was like ‘oh shit.’ I thought it was real.”

Experts believe bad actors have chosen audio fakes over video because consumers tend to be more discerning when it comes to visual fakery. As AI advisor Henry Ajder recently told the Financial Times, “everyone’s used to Photoshop or at least knows it exists.”

As of this article’s publication, there appears to be no universal method for detecting or deterring deepfakes. Experts recommend exercising caution when engaging with media from unknown or dubious sources, especially when extraordinary claims are involved.
