The chairwoman of the US Federal Communications Commission (FCC), Jessica Rosenworcel, has proposed making robocalls that use AI-generated voices illegal.
Today we announced a proposal to make AI-voice generated robocalls illegal – giving State AGs new tools to crack down on voice cloning scams and protect consumers. https://t.co/OfJUZR0HrG
— The FCC (@FCC) January 31, 2024
“Bad actors are increasingly using AI technologies to mimic human voices in automated calls. We are taking steps to protect consumers from this kind of fraud,” the agency’s statement reads.
According to the FCC, the number of such calls has surged in recent years, as the technologies can “mislead consumers by mimicking the voices of celebrities, political figures, and even family members.”
If the proposal is adopted, the regulator would give state attorneys general “new tools to prosecute the bad actors behind these nefarious robocalls and hold them accountable under the law.”
Enforcement would fall under the Telephone Consumer Protection Act of 1991, which restricts political and marketing calls made without the recipient’s consent.
The announcement followed a scandal involving an imitation of US President Joe Biden’s voice. A robocall urged New Hampshire residents not to vote in the state’s primary, claiming that doing so “would allow Republicans to re-elect Donald Trump.” One such call was recorded by NBC journalists.
NBC reports that NH voters are getting robocalls with a deepfake of Biden’s voice telling them to not vote tomorrow.
“it’s important that you save your vote for the November election.” https://t.co/LAOKRtDanK pic.twitter.com/wzm0PcaN6H
— Alex Thompson (@AlexThomp) January 22, 2024
The New Hampshire attorney general’s office issued a statement calling the calls disinformation and advised voters to “completely ignore the content of the messages.” Trump’s campaign has denied any involvement in the incident.
In July 2023, the UN warned that AI-generated deepfakes threaten the integrity of information and can fuel hatred in society.
In March 2023, journalist Joseph Cox “fooled” a bank’s voice-authentication system using a free AI speech synthesis service. According to the Washington Post, fraudsters are increasingly using voice-cloning technologies to extort money from victims’ relatives.
