FCC levies fines for robocall campaign that used AI-generated Biden voice clone
The political consultant and voice service provider allegedly behind a robocall campaign that used artificial intelligence to clone President Joe Biden’s voice have been fined a combined $8 million, the Federal Communications Commission (FCC) announced Thursday.
The agency said the calls, placed two days before the New Hampshire primary and featuring a fake Biden voice asking voters not to turn out, violated the Truth in Caller ID Act for illegally “spoofing” the incoming numbers received by phone owners. The technique deliberately changes the number that appears on a recipient’s caller ID for deceptive purposes.
FCC rules prohibit “knowingly causing the transmission of inaccurate caller ID information with the intent to defraud, cause harm or wrongly obtain anything of value,” the agency said in a press release.
In addition to the $6 million FCC fine, political consultant Steve Kramer was indicted Thursday on 13 felony counts of voter suppression and 13 misdemeanor counts of impersonating a candidate, according to New Hampshire’s attorney general.
Kramer has the right to respond with evidence before the FCC fine becomes official.
A message sent to Kramer’s LinkedIn account was not immediately returned, and a phone number and email address for him could not be located.
The calls were ultimately placed by voice service provider Lingo Telecom, which according to the FCC inaccurately labeled the calls, marking them with “the highest level of caller ID attestation, making it less likely that other providers could detect the calls as potentially spoofed.”
Full caller ID attestation means the service provider has fully authenticated the caller and phone number.
Lingo Telecom, which was fined $2 million, is accused of breaking the FCC’s so-called STIR/SHAKEN rules by not verifying the caller ID information provided by Kramer.
The agency said caller ID authentication is an “essential tool” and “serves as a digital identifier for each call to empower tracebacks of suspicious calls, inform robocall blocking tools, and support more reliable caller ID information for consumers.”
It called the fines targeting Lingo for failing to authenticate caller IDs a “first-of-its-kind enforcement action.”
The New Hampshire Attorney General’s Office and the FCC ordered Lingo to stop transmitting suspicious calls in February.
A spokesperson for Lingo did not immediately respond to an emailed request for comment.
While the FCC called the fines announced Thursday “substantial,” they are small compared with some previous enforcement actions, particularly a $300 million fine the agency issued in August against a vast global robocall network it said made billions of auto-warranty scam calls over several years.
The FCC’s action against Kramer and Lingo is likely meant to send a message on the eve of election season, when voters are expected to be flooded with AI-generated robocalls, experts said.
“Obviously, the issue of deepfakes is new and I would hope the FCC sends a strong signal,” said John Bergmayer, legal director at Public Knowledge, an advocacy group that closely tracks the agency.
Bergmayer said it is easier for the FCC to determine appropriate fines when robocalls lead directly to financial losses, but fines can be an effective lever to stop bad behavior even when consumers don’t lose money.
“Here, I would hope the FCC has deterrence in mind,” he said.
The FCC made robocalls using AI-generated voice cloning illegal in February, saying it would use the 1991 Telephone Consumer Protection Act (TCPA), which prohibits robocalls deploying pre-recorded and artificial voice messages, to enforce the rule.
Suzanne Smalley
is a reporter covering privacy, disinformation and cybersecurity policy for The Record. She was previously a cybersecurity reporter at CyberScoop and Reuters. Earlier in her career Suzanne covered the Boston Police Department for the Boston Globe and two presidential campaign cycles for Newsweek. She lives in Washington with her husband and three children.