Company to Pay $1M for AI Calls Mimicking Biden
A telecommunications company that facilitated deceptive robocalls to New Hampshire voters using artificial intelligence to mimic President Joe Biden's voice has agreed to pay a $1 million fine, according to federal regulators.
Lingo Telecom, the voice service provider responsible for transmitting the calls, settled with the Federal Communications Commission (FCC) to resolve the enforcement action against it. The FCC initially sought a $2 million fine but agreed to the lower amount in the settlement.
This case has raised concerns about the use of AI in political campaigns and its potential impact on democracy. The robocalls, sent to thousands of New Hampshire voters on January 21, featured a voice that closely resembled President Biden's.
The message falsely claimed that participating in the state's presidential primary would prevent voters from casting ballots in the November general election. The misleading calls were orchestrated by a political consultant who now faces a proposed $6 million fine from the FCC, as well as state criminal charges.
The consultant, who hired a magician and self-described "digital nomad" to create the AI-generated recording, has previously stated that he was not attempting to influence the primary's outcome. Instead, he claimed his goal was to demonstrate the potential dangers of AI and prompt lawmakers to take action. If convicted, the consultant could face up to seven years in prison for voter suppression and an additional year for impersonating a candidate.
As part of the settlement, Lingo Telecom has agreed to implement stricter caller ID authentication rules and requirements. The company is also required to more thoroughly verify the accuracy of the information provided by its customers and upstream providers. The FCC emphasized the importance of ensuring that the voice on the line is genuinely who it claims to be, especially in the context of AI usage.
The FCC's chairperson stated that transparency is crucial and that the commission will take action whenever trust in communications networks is compromised.
Lingo Telecom has not responded to requests for comment. The company had previously expressed strong disagreement with the FCC's actions, arguing that the commission was attempting to impose new rules retroactively.
The case has drawn praise from consumer advocacy groups. Public Citizen, a nonprofit organization, commended the FCC for its response. The group's co-president echoed the chairperson's stance, stressing that consumers have the right to know whether they are receiving authentic content or AI-generated deepfakes. He also warned that such deepfakes pose a significant threat to democracy.
The FCC's Enforcement Bureau Chief highlighted the dangers of combining caller ID spoofing with generative AI voice-cloning technology. He noted that this combination could be exploited by both domestic operatives seeking political advantage and foreign adversaries aiming to interfere in elections or conduct malign influence activities.