U.S. Congresswoman Yvette D. Clarke has introduced a bill that would require disclosure of AI-generated content in political advertising, The Washington Post reports.
The document argues that reform is needed to address the potential ‘use of generative AI that harms democracy’.
According to Clarke, the immediate trigger for the bill was a video released by the Republican National Committee (RNC) that used AI-generated imagery to depict a dystopian vision of a potential second term for President Joe Biden.
The 30-second clip appeared after Biden announced his intention to run for a second term. It depicts fabricated scenes of Chinese aggression toward Taiwan and of immigrants at the southern border of the United States.
The video’s creators also included a disclaimer stating that the clip ‘was entirely created using AI-generated images’.
According to Clarke, that disclosure made the RNC advertisement ‘an exemplar of transparent deployment of artificial intelligence’. But she doubts that all content creators will follow suit.
The bill would amend the Federal Election Campaign Act. If enacted, it would require the authors of political advertisements to disclose any use of AI in creating their content.
Clarke acknowledged that the technology has important applications.
‘But there must be certain rules of conduct so that the American people are not deceived and do not face danger,’ Clarke added.
In May, American lawmakers introduced a bill that would ban AI from launching nuclear weapons.
In April, Biden was urged to speed up AI regulation.
In the same month, Senator Chuck Schumer presented a framework for regulating the development, deployment, and use of advanced machine-learning technologies.
