FCC fines consultant $6 million for AI-generated Biden robocall

The Federal Communications Commission (FCC) has finalised a $6 million fine against political consultant Steven Kramer for using artificial intelligence to create fake robocalls that mimicked President Joe Biden’s voice, urging voters in New Hampshire not to participate in the Democratic primary.

The fine is part of the agency's broader effort to crack down on the misuse of AI in political campaigns.

Kramer, a Louisiana-based Democratic consultant, was indicted in May by New Hampshire authorities for sending out the robocalls, which falsely appeared to be from President Biden.

The calls urged recipients to save their votes for November rather than cast them in the state's Democratic primary, which was held earlier in the year.

Kramer had previously worked for Representative Dean Phillips, a Biden primary challenger, who publicly condemned the robocalls.

In a statement earlier this year, Kramer admitted to paying $500 to have the calls distributed, claiming his goal was to raise awareness about the potential misuse of AI in political campaigns.

The calls used AI-generated deepfake technology to mimic Biden’s voice, raising concerns about the potential for similar technology to interfere with elections.

FCC Chair Jessica Rosenworcel stressed the risks posed by AI in political communication, stating, "It is now cheap and easy to use Artificial Intelligence to clone voices and flood us with fake sounds and images... This technology can illegally interfere with elections. We need to call it out when we see it and use every tool at our disposal to stop this fraud."

Kramer faces the $6 million fine for violating FCC rules that prohibit the transmission of inaccurate caller ID information. If he fails to pay the fine within 30 days, the matter will be referred to the Justice Department for collection.

Kramer has not yet commented on the ruling, and attempts to reach him or a spokesperson were unsuccessful.

This case is part of a broader effort by the FCC to regulate AI usage in political campaigns and telecommunications. In August, Lingo Telecom agreed to a $1 million settlement with the FCC for its role in transmitting the New Hampshire robocalls.

As part of the settlement, Lingo committed to complying with FCC caller ID authentication rules.

The FCC is also considering further regulation of AI-generated content in political ads. In July, the commission proposed a rule requiring broadcast political advertisements to disclose if any part of the content was generated using AI.

The proposal is still under review.

The case underscores growing concerns about the impact of AI on election integrity, as deepfake technology becomes more accessible and easier to deploy.