Researchers teach artificial intelligence to sleep

An international group of researchers has made a spiking neural network alternate training with "sleep" so that it can perform two different tasks without overwriting the connections learned for the first one, Motherboard reports.

“There is a strong trend now to use ideas from neurobiology and biology to improve the performance of machine learning algorithms. Sleep is one of them,” said Maxim Bazhenov, a co-author of the study and a researcher at the University of California, San Diego.

Artificial neural networks often reach superhuman performance on individual tasks. But when it comes to continual learning, that is, mastering one task after another, they struggle to acquire new knowledge without losing old memories.

“After thorough training, AI finds it very hard to learn to perform a completely new operation. And if it does manage, the old memory might be damaged,” noted Pavel Sanda, a co-author of the study and a researcher at the Czech Academy of Sciences.

He explained that this effect is known as "catastrophic forgetting," and that it can be overcome only through memory consolidation, the process that converts recent short-term memories into long-term ones and often takes place during REM sleep.

According to the scientist, memory reorganization plays a big role in why humans need to sleep at all. If this process malfunctions or is interrupted, people can suffer serious cognitive impairments.

“This phenomenon can be observed in elderly people who recount childhood experiences in detail, but have difficulty remembering what they had for lunch yesterday,” noted Sanda.

The researchers drew on prior work in memory plasticity and sleep modelling, employing a neural network to simulate sensory processing and reinforcement learning in an animal brain.

They set the model two separate tasks, in each of which it learned to distinguish reward from punishment.

They then tested whether the AI would exhibit catastrophic forgetting, and it did: each training session on the second task erased the knowledge learned on the first.
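The effect is easy to reproduce outside the study's setup. Below is a minimal sketch in plain NumPy, not the spiking reinforcement-learning model the researchers used; the synthetic tasks, model and hyperparameters are invented purely for illustration.

```python
# A minimal sketch of catastrophic forgetting with a tiny NumPy model.
# The tasks, model and hyperparameters are invented for illustration;
# the study used a spiking reinforcement-learning network, not this setup.
import numpy as np

D = 20  # input dimension (arbitrary)

def make_task(seed, n=500):
    """Synthetic binary task: labels follow a random linear rule."""
    r = np.random.default_rng(seed)
    X = r.normal(size=(n, D))
    y = (X @ r.normal(size=D) > 0).astype(float)
    return X, y

def train(w, X, y, epochs=300, lr=0.5):
    """Full-batch logistic-regression updates (a stand-in for 'training on a task')."""
    for _ in range(epochs):
        p = 1 / (1 + np.exp(-(X @ w)))
        w = w - lr * X.T @ (p - y) / len(y)
    return w

def accuracy(w, X, y):
    return (((X @ w) > 0).astype(float) == y).mean()

Xa, ya = make_task(seed=1)  # "task A"
Xb, yb = make_task(seed=2)  # "task B"

w = train(np.zeros(D), Xa, ya)
print("task A after training on A:", accuracy(w, Xa, ya))   # high

w = train(w, Xb, yb)  # now train the same weights on task B
print("task B after training on B:", accuracy(w, Xb, yb))   # high
print("task A after training on B:", accuracy(w, Xa, ya))   # typically drops toward chance
```

In a linear model like this, the second task simply overwrites the weights; larger networks trained sequentially without any protection for old knowledge suffer the same fate.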

But when the researchers made the algorithm imitate biological sleep by driving the artificial neurons with noise, they saw progress: rapidly alternating these rest phases with training on the second task reportedly allowed the model to "remember" how to perform the first.
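The sketch below illustrates only that alternation: short bouts of supervised training on a second task interleaved with "sleep" phases in which the network is driven by random noise and its hidden weights are nudged by a simple Hebbian rule. It is a loose stand-in for the noise-driven replay described above, not the authors' spiking implementation; every size and hyperparameter is invented, and whether this toy version actually preserves the first task is not guaranteed.

```python
# A crude sketch of the "interleaved sleep" idea: short bouts of supervised
# training on task B alternate with sleep phases in which the network is
# driven by random noise and a Hebbian rule reshapes the hidden weights.
# Only a stand-in for the study's approach; all values are invented.
import numpy as np

rng = np.random.default_rng(0)
D, H = 20, 32  # input and hidden sizes (arbitrary)

def make_task(seed, n=500):
    r = np.random.default_rng(seed)
    X = r.normal(size=(n, D))
    y = (X @ r.normal(size=D) > 0).astype(float)
    return X, y

def forward(params, X):
    W1, W2 = params
    h = np.maximum(0.0, X @ W1)        # ReLU hidden layer
    p = 1 / (1 + np.exp(-(h @ W2)))    # sigmoid output
    return h, p

def train_supervised(params, X, y, steps, lr=0.2):
    W1, W2 = params
    for _ in range(steps):
        h, p = forward((W1, W2), X)
        err = (p - y) / len(y)                        # logistic-loss gradient at the output
        g2 = h.T @ err
        g1 = X.T @ (np.outer(err, W2) * (h > 0))
        W1, W2 = W1 - lr * g1, W2 - lr * g2
    return W1, W2

def sleep_phase(params, steps=100, eta=0.002):
    """Noise-driven 'sleep': Hebbian updates to W1, then restore column norms."""
    W1, W2 = params
    norms = np.linalg.norm(W1, axis=0, keepdims=True)
    for _ in range(steps):
        x = rng.normal(size=D)                        # random input noise
        h = np.maximum(0.0, x @ W1)                   # spontaneous hidden activity
        W1 = W1 + eta * np.outer(x, h)                # Hebbian: pre-activity times post-activity
    W1 *= norms / (np.linalg.norm(W1, axis=0, keepdims=True) + 1e-8)
    return W1, W2

def accuracy(params, X, y):
    _, p = forward(params, X)
    return ((p > 0.5).astype(float) == y).mean()

Xa, ya = make_task(1)  # "task A"
Xb, yb = make_task(2)  # "task B"

params = (rng.normal(scale=0.1, size=(D, H)), rng.normal(scale=0.1, size=H))
params = train_supervised(params, Xa, ya, steps=500)   # learn task A first
print("task A after training on A:", accuracy(params, Xa, ya))

# Rapidly alternate task-B training with noisy rest, as in the article's description.
for _ in range(20):
    params = train_supervised(params, Xb, yb, steps=25)
    params = sleep_phase(params)

print("task B:", accuracy(params, Xb, yb))
print("task A:", accuracy(params, Xa, ya))  # compare with purely sequential training
```

In the study itself the model was a spiking network, and its sleep phase relied on the network's own plasticity rules rather than the toy Hebbian step used here; the sketch only mimics the overall schedule of alternating noisy rest with training.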

“This is another good demonstration that simple principles can lead to effects that aren’t so simple,” said Sanda.

Earlier this year, a British startup created a machine-learning model for managing different types of vehicles.

In August 2021, DeepMind developed a universal architecture for processing all types of input and output data.

In February, Sonantic created an AI algorithm imitating coquettish speech patterns thanks to new “non-verbal sounds,” including sighs, pauses and light laughter.
