U.S. President Joe Biden's administration held a meeting with the heads of technology companies to discuss AI-related risks, the BBC reports.
Attendees included the chief executives of Google, Microsoft, OpenAI and Anthropic. The White House said they have a “moral obligation” to shield the public from the dangers of artificial intelligence.
Officials told the tech giants that companies must “ensure the safety of their products”. The administration also said it is open to new rules and laws governing artificial intelligence.
OpenAI CEO Sam Altman told reporters that leaders were “surprisingly united” on the need to regulate the technology.
According to Vice President Kamala Harris, AI may pose risks to security, privacy and civil rights. She also acknowledged the potential benefits that the technology could bring.
Harris said the private sector bears “ethical, moral and legal responsibility for ensuring the safety of its products”.
Biden also stopped by the meeting briefly. The president backed Harris and said AI has “enormous potential and enormous danger”.
“Artificial Intelligence is one of the most powerful tools of our time, but to seize its opportunities, we must first mitigate its risks. Today, I dropped by a meeting with AI leaders to touch on the importance of innovating responsibly and protecting people’s rights and safety,” Biden wrote on Twitter (@POTUS) on May 4, 2023.
“I know you understand this. And I hope you can enlighten us about what, in your view, is most needed to protect society and move forward,” the president said.
The White House also announced $140 million in National Science Foundation funding to launch seven new AI research institutes.
In May, Geoffrey Hinton, often called the ‘godfather’ of AI, left Google and said he now regrets his work.
In April, Biden called concerns about AI premature.
In March, hundreds of experts signed an open letter urging a six-month pause in the deployment of the technology.
Later, Microsoft co-founder Bill Gates opposed the initiative, saying a pause would not solve the problems.
