
Fake Biden robocall telling Democrats not to vote is likely an AI-generated deepfake

The state attorney general's office said the call was probably an effort at voter suppression ahead of Tuesday's primary.
President Joe Biden in Phoenix on Dec. 6, 2022. Patrick Semansky / AP file

A recent robocall from a fake President Joe Biden telling New Hampshire residents not to vote was almost certainly created with artificial intelligence, according to disinformation experts and people who study the technology.

The call, which the New Hampshire attorney general’s office has described as an apparent “unlawful attempt” to suppress voters from writing in Biden’s name in the state’s Democratic presidential primary Tuesday, is of unknown origin. Experts say it appears to be a deepfake — fake audio or video created with AI and designed to mimic real people, usually without their knowledge or consent.

“All signs point to it being a deepfake,” said Ben Colman, the CEO of Reality Defender, a company that creates software to test media files to see whether they appear artificially generated.

“We never say anything’s 100% certain, because we do not have the ground truth, but it’s highly likely manipulated,” Colman said.

The voice on the robocall, first obtained by NBC News, sounds like Biden’s, though the cadence is clipped. It’s nearly impossible to pin down which AI program created the audio: programs that can produce a moderately convincing replica of someone’s voice are widely available as phone apps and online services, usually free or for a small fee.

Such programs often need only a small sample of audio to replicate a person’s voice, making it trivially easy to mimic a politician, whose speeches and interviews are abundant in public recordings.

Lindsay Gorman, who studies emerging technologies and disinformation at the German Marshall Fund’s Alliance for Securing Democracy, said deepfakes often contain tells, though the technology is constantly improving.

“The cadence, particularly towards the end, seemed unnatural, robotic. That’s one of the tipoffs for a potentially faked piece of audio content,” she said.

Clearly identifying what is or isn’t a deepfake is becoming a game of cat and mouse, with developers continually improving the technology, Gorman said.

“In the case of visual deepfakes, one signature is people’s eyes,” she said. “That’s generally a good tipoff: if their eye movements are unnatural, then that could be a deepfake. But deepfake software has gotten better and has addressed some of the weird, erratic eye movements: staring, too much blinking.”

Sen. Richard Blumenthal, D-Conn., who introduced a framework for AI legislation in the Senate, said he hopes the incident alerts Americans to the disinformation dangers AI can pose.

“It can be mastered in literally minutes by a neophyte. So unfortunately and tragically, that’s our future unless we act decisively,” Blumenthal told NBC News.

“I hope that the Biden deepfake, or imitation, will actually be a shock to the system and perhaps alert everyone that in fact everyone is at risk,” he said.

While federal law criminalizes knowing attempts to keep people from voting or registering to vote, there is little regulation governing the deceptive use of AI. Some activists have pressured the Federal Election Commission to regulate deepfake ads, but the agency has yet to determine whether it will begin any rulemaking around the technology, a spokesperson said.

Mekela Panditharatne, senior counsel in the Democracy Program at the Brennan Center for Justice at New York University School of Law, said that while the apparent use of deepfake technology to keep people from voting in a U.S. presidential primary may be new, using robocalls to suppress votes isn’t.

“Robocalls have historically been deployed by deceptive actors to spread false information about how, when and where to vote in an attempt to trick people out of voting,” Panditharatne said. “Voice-generation AI potentially makes that method of attempted vote suppression more attractive to fraudsters.”