
US urged to ban autonomous killer robots
The nonprofit The Future of Life Institute (FLI) released a short film calling for a ban on the use of “killer robots.” The video highlights the risks associated with autonomous weapons and steps that can be taken to prevent their spread.
The video, “Slaughterbots — if human: kill(),” is presented as a news bulletin. The authors highlighted several scenarios in which robots could be used to kill people:
- robbing banks;
- attacking police stations;
- hunting down military personnel.
The authors also demonstrated autonomous robodogs with weapons on their backs.
“In the case of a drone, the decision to strike is made by a human operator, while autonomous weapons determine whom to spare and whom to kill,” the video says.
FLI, in collaboration with the International Committee of the Red Cross, proposed banning algorithms from autonomously deciding to kill targets. They cited four reasons for restricting such systems:
- The probability of conflict escalation due to the broad spread of AI;
- The mass deployment of autonomous weapons due to the low cost of the technology;
- The unpredictability of machine learning algorithms in real combat conditions;
- The ability of such systems to selectively target particular groups of people.
“We urgently need new international law prohibiting autonomous weapons from eliminating people and imposing restrictions on other types of AI-enabled weaponry,”
the closing credits say.
Earlier in December, at the opening of the Review Conference of the Convention on Certain Conventional Weapons (the “inhumane weapons” convention) in Geneva, UN Secretary-General António Guterres called for measures against “killer robots.”
China backed the head of the organization and was among the first countries to speak out against the use of AI for military purposes.
By the end of December, the participating countries had failed to reach a consensus on regulating autonomous weapons and agreed to continue discussions.