61% of Americans say AI threatens humanity

Most Americans believe that the rapid development of artificial intelligence technologies could threaten humanity’s future, according to a joint study by Reuters and Ipsos.

More than two-thirds of respondents are concerned about negative consequences from AI adoption, and 61% believe it could threaten civilization.

22% of respondents disagreed with that assertion, while 17% were unable to answer.

Supporters of former U.S. President Donald Trump expressed greater concern: 70% of them agreed that AI could threaten humanity, compared with 60% of Joe Biden’s voters.

By religious affiliation, 32% of evangelicals “completely agreed” that AI threatens civilization, a view shared by 24% of adherents of other Christian denominations.

“This shows that many Americans are concerned about the negative consequences of AI,” said Landon Klein, director of US policy at the Future of Life Institute.

According to him, the current situation is comparable to the dawn of the nuclear era, but this time there is an opportunity to gauge public opinion and take the necessary measures.

Earlier, researchers from The Tech Oversight Project found that a majority of Americans support the US Congress’s efforts to curb AI.

Earlier in March, Goldman Sachs analysts said that generative AI threatens about 300 million jobs in developed countries.

In the same month, OpenAI researchers assessed the impact of GPT models on the U.S. labor market. According to their findings, the technology can automate up to 10% of tasks for 80% of American workers.

In May, billionaire investor Warren Buffett compared artificial intelligence to the creation of the atomic bomb.
