OpenAI develops model to generate short extracts from fiction books

The OpenAI research lab has developed an artificial intelligence model that can summarize books of any length. The fine-tuned version of GPT-3 first summarizes small sections of a book, then consolidates those summaries into a short retelling.


"We want our AI systems to be aligned with human intentions. This is especially important as tasks get more difficult to evaluate. To develop techniques to address this problem, we trained a model to summarize books. https://t.co/NDnUtcjXFX"

— OpenAI (@OpenAI) September 23, 2021


To create the model, the developers combined reinforcement learning with recursive task decomposition, which procedurally breaks the complex task of summarizing a long text into simpler subtasks. Because each subtask covers only a small part of a book, humans can quickly evaluate the model's work on those passages. As a result, the algorithm can succinctly summarize books of any length, from dozens of pages to hundreds or thousands.
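In outline, the decomposition works like a recursion over the text: summarize the leaves, then summarize the summaries until one short retelling remains. The sketch below is a minimal illustration of that idea, not OpenAI's actual pipeline: summarize_passage, CHUNK_SIZE, and TARGET_SIZE are hypothetical placeholders, and the reinforcement-learning step that trains the summarizer from human feedback is omitted.

```python
# Minimal sketch of recursive task decomposition for book summarization.
# NOTE: summarize_passage, CHUNK_SIZE, and TARGET_SIZE are hypothetical
# placeholders; OpenAI's real system also trains the summarizer with
# reinforcement learning from human feedback, which is not shown here.

CHUNK_SIZE = 2000   # characters per leaf passage (illustrative value)
TARGET_SIZE = 1000  # stop recursing once the text fits a single summary


def summarize_passage(text: str) -> str:
    """Placeholder for a single model call that compresses one passage."""
    raise NotImplementedError("swap in a real summarization model")


def summarize_book(text: str) -> str:
    """Recursively summarize: leaves first, then summaries of summaries."""
    # Base case: short enough to compress in one pass.
    if len(text) <= TARGET_SIZE:
        return summarize_passage(text)

    # Split into fixed-size passages and summarize each independently.
    chunks = [text[i:i + CHUNK_SIZE] for i in range(0, len(text), CHUNK_SIZE)]
    section_summaries = [summarize_passage(c) for c in chunks]

    # Recurse on the concatenated summaries until one summary remains.
    return summarize_book(" ".join(section_summaries))
```

This recursion is also what makes human evaluation tractable: a rater never has to read a whole book, only one short passage and the model's summary of it.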


\"OpenAI
A summary of the book \”Alice’s Adventures in Wonderland\” created by an artificial intelligence. Data: OpenAI.


OpenAI trained the model on a subset of the fiction books in GPT-3's training data, averaging more than 100,000 words each. To evaluate it, the researchers selected the 40 most popular books of 2020, had two people read each one and write their own summaries, and then asked those readers to rate both the model's summaries and each other's.


According to the researchers, the program successfully produced "book-level" summaries containing the majority of the important information. However, OpenAI acknowledged that it sometimes generated inaccurate statements due to a lack of context. Moreover, the model's short summaries often read as a list of events from the book rather than a coherent synopsis, a limitation imposed by the task-decomposition algorithm.


“This work is part of our ongoing research into aligning advanced AI systems, which is central to our mission [to build artificial general intelligence],” OpenAI researchers wrote in a blog post.


The organization also said it does not plan to make the model public or release its source code.


In August, OpenAI unveiled Codex, a model that automatically writes code.


In July, OpenAI released Triton, a Python-like programming language for developing neural networks.


In June, researchers from the organization discovered a way to improve the ‘behavior’ of the GPT-3 language model with respect to ethical, moral and social values.

