FCC chair proposes requirement for political ads to disclose when AI content is used
The leader of the Federal Communications Commission (FCC) wants the agency to examine whether it should force campaigns and political action committees to flag when political ads on radio and TV include content produced by artificial intelligence.
Chairwoman Jessica Rosenworcel said on Wednesday that she has circulated the proposal with her four fellow commissioners. If they support her idea, the agency will formally begin proceedings to study making the change.
The proposal comes on the heels of a wave of AI-generated robocalls mimicking President Joe Biden’s voice and telling voters not to turn out for the New Hampshire primary in January. In February, the FCC declared robocalls using AI-generated voice cloning illegal.
AI also has been used extensively in the political campaigns bombarding voters in India’s elections, which end June 1. Politicians there have used the technology to create propaganda, crude audio fakes and satire.
If the other FCC commissioners agree with the proposal, the agency will first ask the public for input on whether broadcasters should be required to make on-air and written disclosures when political candidates and groups use AI to make ads. It also will seek public comment on how to define AI-generated content.
Rosenworcel is proposing that both candidate and issue advertisements be subject to the rule and wants the new requirements to apply to a wide range of entities in the broadcasting ecosystem, including cable operators, satellite TV and radio providers. The proposal also would apply to section 325(c) permittees, or those sending programming to foreign stations.
“As artificial intelligence tools become more accessible, the Commission wants to make sure consumers are fully informed when the technology is used,” Rosenworcel said in a statement emailed to reporters.
AI is anticipated to play a major role in the design of political ads in the 2024 cycle and could deceive voters, experts say, particularly if so-called deepfakes are deployed.
Deepfakes are images, video and audio recordings that trick audiences into believing people have said or done something that they did not.
In making her announcement, Rosenworcel said the Bipartisan Campaign Reform Act gives the FCC the authority to regulate political advertising.
Suzanne Smalley is a reporter covering privacy, disinformation and cybersecurity policy for The Record. She was previously a cybersecurity reporter at CyberScoop and Reuters. Earlier in her career she covered the Boston Police Department for the Boston Globe and two presidential campaign cycles for Newsweek. She lives in Washington with her husband and three children.