Survey: 66% of risk managers regard generative AI as a threat to organisations

Generative artificial intelligence (AI) has become one of the top risks for organisations, according to a Gartner survey: it was cited by 66% of respondents.

The survey polled 249 chief risk officers in the second quarter of 2023. Generative AI appeared in the ranking for the first time, Gartner noted.

The top five risks for organisations by frequency of mention were:

  • solvency of counterparties (67%);
  • widespread availability of generative AI (66%);
  • uncertainty in financial planning (62%);
  • concentration of operations in the cloud (62%);
  • tensions in trade with China (56%).

Given the pace of generative AI development, Gartner analysts identified three areas to keep under control from a corporate risk-management perspective: intellectual property, data privacy and cybersecurity.

“This reflects the rapid growth of public awareness and adoption of generative AI tools, as well as the breadth of potential uses and, therefore, the potential risks of these tools,” said Ran Xu, director of research in the Gartner Risk & Audit Practice.

In February, Apple co-founder Steve Wozniak urged caution with ChatGPT. In his words, the chatbot is ‘quite impressive’ and ‘useful for people’, but it also poses a number of serious problems.

In March, more than a thousand AI experts called for a six-month pause in the development of large language models. Two days later, another group of experts criticised the initiative.

In April, Microsoft co-founder Bill Gates spoke out against the calls, driven by concerns about risks to society, to pause AI development.
