
Drone kills a person; robot enrols at university and other AI news

We strive to inform readers not only about developments in the bitcoin industry but also about the broader technology sphere: cybersecurity and, now, the world of artificial intelligence (AI).

ForkLog has compiled the most important AI news from the past week.

  • The UN recorded the first-ever instance of a combat drone killing a human without a direct order from an operator.
  • The Israeli army deployed artificial intelligence and supercomputers during the Gaza Strip conflict.
  • Sberbank developed technology to identify a person armed with a weapon through video surveillance cameras.
  • In the Moscow region, AI will monitor schoolchildren.
  • NVIDIA will open access to its supercomputer for machine-learning developers, at $90,000 per month.
  • A new mathematical brain model was presented to aid the creation of advanced AI.
  • Researchers created fabric embedded with artificial intelligence.
  • In China, a humanoid robot was enrolled at a university.
  • Engineers created a system enabling a robot face to express emotions.
  • A musician and developer built a universal AI-powered effects pedal that imitates almost any electric guitar sound.

Combat drone kills a person without direct order from an operator

A combat drone attacked a person without a direct order from an operator for the first time, media reported, citing a UN report.

The KARGU-2 attack quadcopter, operating autonomously, attacked a person during clashes between rival factions in the Libyan civil war in 2020. The drone independently selected a Libyan army soldier as its target and, without an operator’s order, delivered a fatal strike.

Military experts confirmed that this is the first known instance of a drone autonomously deciding to neutralise a target.

Human Rights Watch called for a ban on the development, production and use of autonomous weapons. The rights group argues that such a ban would put an end to killer robots.

Israel used AI in armed conflict in the Gaza Strip

The Israeli army became the first in the world to use artificial intelligence and supercomputers in real combat conditions, during the recent conflict in the Gaza Strip.

According to the military, AI processed data from optical, electronic and human intelligence to generate recommendations that troops used to identify and strike targets.

Another ML algorithm used the information to warn troops in the field of possible enemy attacks.

The military also said they relied on intelligence data to carry out targeted strikes, aiming to minimise civilian casualties.

Sberbank presents algorithm for recognising armed people

The Sberbank laboratory created AI to identify armed individuals on video in real time.

The developers said the neural network learned not only to detect a pistol in a person’s hand but also to distinguish a real weapon from a prop. They say the technology will improve safety in various institutions.

“If the algorithm were applied, say, in a school, we could automatically receive a signal without phone calls that there is a risk of a terrorist attack,” the company said.

In Russian schools, AI will monitor children

A school in the Moscow region will pilot an AI system for monitoring the safety of children and the condition of the infrastructure.

AI-enabled cameras will learn to recognise atypical behaviour, such as running in corridors or sliding down handrails. In such cases the system will notify responsible staff so they can take the necessary measures.

According to the developers, all of the technologies used are Russian. They also emphasised that the system does not make decisions itself: it is designed to inform responsible personnel about potential incidents.

NVIDIA will open subscription access to the supercomputer

NVIDIA will open access to its cloud supercomputer DGX SuperPOD to AI developers on a subscription basis at $90,000 per month.

Users will receive a turnkey data-centre solution to boost the performance of their infrastructure.

DGX SuperPOD provides 100 petaflops of compute. Similar supercomputers are used at the German AI research centre for satellite and aerial imagery analysis, and at the University of Florida for molecular modelling of proteins with quantum precision.

The service will be available in the summer of 2021.

AI that works like the human brain

A group of researchers from Google Research developed AI that operates like the human brain.

They described in detail a mathematical model of the brain divided into a finite number of regions, each containing several million neurons connected both within the region and to neurons in other regions.

According to the scientists, the model incorporates randomness, plasticity and inhibition. This means that neurons connect randomly, connections change during learning, and only a limited number of neurons can fire at any one time.

The researchers have already assembled the first model capable of performing a set of operations to store, retrieve and process information.
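Below is a minimal, hypothetical sketch of the kind of mechanism described above, assuming a simplified setting: a single region of model neurons with random connections, Hebbian plasticity, and inhibition implemented as “only the most excited neurons fire”. It illustrates the idea only and is not the researchers’ actual model or code; all names and parameters are placeholders.

```python
# Minimal sketch (assumption, not the authors' code) of a brain-like region with
# random connectivity, Hebbian plasticity and winner-take-all-style inhibition.
import numpy as np

rng = np.random.default_rng(0)

N = 1000      # neurons in the region (the paper's regions hold millions)
K = 50        # inhibition: only the K most excited neurons may fire at once
P = 0.05      # probability of a random connection between two neurons
BETA = 0.1    # plasticity: synapses into firing neurons strengthen by this factor

# Random synaptic weight matrix: W[i, j] is the connection from neuron i to neuron j.
W = (rng.random((N, N)) < P).astype(float)

def project(active_src, W):
    """One step: active neurons excite the region, inhibition keeps only the
    K most excited neurons, and plasticity strengthens the synapses used."""
    drive = W[active_src].sum(axis=0)          # total input to each neuron
    winners = np.argpartition(drive, -K)[-K:]  # inhibition: top-K neurons fire
    W[np.ix_(active_src, winners)] *= 1 + BETA # Hebbian update on used synapses
    return winners

# Repeatedly projecting the same random stimulus makes the winning set stabilise,
# forming a persistent "assembly" that stores the stimulus.
stimulus = rng.choice(N, size=K, replace=False)
winners = project(stimulus, W)
for _ in range(10):
    winners = project(stimulus, W)
```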

MIT develops digital fabric with AI

Engineers at the Massachusetts Institute of Technology unveiled digital fiber with memory and AI. It is flexible, can be woven into fabric and even washed.

Fabric from digital fiber. Data: MIT.

To do this, they embedded a hundred silicon microchips and digital circuits into a preform, from which they drew a polymer fiber. It can store, for example, a 767-kilobit video file for two months without charging.

According to the researchers, AI in the fiber will make it possible to collect and process data about the body and uncover patterns that scientists were not previously aware of.

A robot student enrolled at a Chinese university

The prestigious Tsinghua University has enrolled a humanoid robot student named Hua Zhijin, who looks indistinguishable from a human. She will study in the computer science department.

Robot student Hua Zhijin. Data: screenshot from a demonstration video.

According to the professor overseeing her training, Zhijin can compose poetry, draw and write music. In a year she will reach the development level of a 12-year-old, the professor added.

The aim of the experiment is not only to train the robot in practical skills but also to teach it to interact with people and express emotions.

Engineers taught the robot to express emotions

Engineers taught the robot EVA to respond to human facial expressions with various emotions using AI.

EVA can display anger, disgust, fear, joy, sadness and surprise. It can also show more nuanced emotions using artificial “muscles” that mimic the movements of 42 human facial muscles, all controlled by neural networks.

Researchers then built AI that helps the robot read others’ emotions and adapt to the overall mood.

Researchers believe that such technologies could help create robots useful in the workplace, hospitals, schools and nursing homes in the future.

Enthusiast creates AI-powered universal effects pedal for electric guitar

Engineer and musician Keith Bloemer developed a homemade system that uses deep learning algorithms to imitate virtually any existing electric guitar effects unit in real time.

To do this he used a Raspberry Pi single-board computer, an audio interface, a low-latency OS for audio and a VST3 plugin.

The project source code and build instructions are available on GitHub. The developer also posted the training code — anyone can train the network to emulate the required audio effect.
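As a rough illustration of the general approach behind such projects, the hypothetical sketch below trains a small recurrent network to map “dry” guitar audio to the same audio processed by a target effect, so the network learns to imitate that effect. The model, data and training loop are simplified stand-ins, not the developer’s actual code from the GitHub repository.

```python
# Sketch: train a tiny LSTM to imitate a guitar effect from paired dry/effected audio.
import torch
import torch.nn as nn

class EffectEmulator(nn.Module):
    """Predicts the effected signal sample-by-sample from the dry signal."""
    def __init__(self, hidden_size=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, 1)

    def forward(self, x):              # x: (batch, samples, 1) dry audio
        h, _ = self.lstm(x)
        return self.out(h)             # (batch, samples, 1) predicted effected audio

# Hypothetical training data: in practice, paired recordings of the same performance
# with and without the target effect, cut into short windows.
dry = torch.randn(8, 4096, 1)          # stand-in for real dry guitar audio
wet = torch.tanh(3.0 * dry)            # stand-in for the target effect (soft clipping)

model = EffectEmulator()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(200):                # real projects train far longer on real recordings
    pred = model(dry)
    loss = loss_fn(pred, wet)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Once trained, such a model can be exported and run inside a low-latency audio plugin on a single-board computer, which matches the hardware setup described above.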

Subscribe to ForkLog news on Telegram: ForkLog AI — all AI news!
