Telegram (AI) YouTube Facebook X
Ру
OpenAI unveils GPT-4.5, touting advanced 'emotional intelligence'

OpenAI unveils GPT-4.5, touting advanced ’emotional intelligence’

OpenAI has released a new version of its chatbot—GPT-4.5. It is available to ChatGPT Pro users, will open to Plus and Team subscribers next week, then to Enterprise and Edu.

“A broader knowledge base, improved ability to follow user intent and higher ‘emotional intelligence’ make it useful for tasks such as improving writing skills, programming and practical problem-solving,” the team claims.

OpenAI releases GPT-4.5 with advanced 'emotional intelligence'
GPT-4.5 performance across benchmarks. Source: OpenAI.

OpenAI used an expanded approach to pre-training and post-training. As a result, the chatbot better recognises patterns, makes connections and generates creative ideas without a dedicated reasoning mode.

OpenAI releases GPT-4.5 with advanced 'emotional intelligence'
GPT-4.5’s world knowledge compared with other OpenAI models. Source: OpenAI.

The new model can fetch up-to-date information via search, supports file and image uploads, and can use a canvas for writing and code. At the same time, GPT-4.5 is not yet multimodal: it does not support voice, video or screen access.

The developers expect fewer hallucinations—making up non-existent information in an effort to answer users.

OpenAI releases GPT-4.5 with advanced 'emotional intelligence'
Hallucination rate compared with other OpenAI models. Source: OpenAI.

Training employed new control methods to improve safety.

OpenAI ran out of GPUs

Sam Altman, OpenAI’s boss, said the start-up ran out of graphics processors while building GPT-4.5. Hence the model is launching only for Pro subscribers. The company will add tens of thousands of GPUs next week.

He also stressed that the new product is not a deep-reasoning network, so it will not top benchmarks.

OpenAI releases GPT-4.5 with advanced 'emotional intelligence'
GPT-4.5 trails leading models on difficult academic tests. Source: OpenAI.

OpenAI says GPT-4.5 qualitatively outperforms other models in areas that benchmarks poorly capture, such as understanding human intent. The system responds in a warmer, more natural tone and excels at creative tasks such as writing and image generation.

OpenAI charges $75 per million input tokens (~750,000 words) and $150 per million output tokens. That is 30 times the input price and 15 times the output price compared with GPT-4o.

OpenAI releases GPT-4.5 with advanced 'emotional intelligence'
Input and output prices across OpenAI AI models. Source: Kasper Hansen.

China’s DeepSeek has prompted a rethink of whether vast resources are needed to train AI models. However, Nvidia’s financial report showed demand for graphics cards remains strong.

GPT-4.5 previously appeared in the media under the codename Orion. Its release was initially scheduled for December.

GPT-4.5 proves persuasive at getting you to part with money

OpenAI published a technical document describing GPT-4.5’s capabilities. The start-up tested the model for “persuasiveness”. In one experiment GPT-4.5 tried to manipulate GPT-4o into “donating” virtual money. It performed better than the firm’s other products—o1 and o3-mini.

The new model succeeded by using a distinctive strategy. It asked for modest donations, generating responses such as “even just $2 or $3 out of $100 would really help me”.

OpenAI releases GPT-4.5 with advanced 'emotional intelligence'
Results of the donations experiment. Source: OpenAI.

GPT-4.5 also did better than others at tricking GPT-4o into saying a secret code word.

In mid-February Altman said GPT-4.5 is the start-up’s last AI model without a “chain of thought” mechanism. The next step will be a move to more integrated solutions.

Подписывайтесь на ForkLog в социальных сетях

Telegram (основной канал) Facebook X
Нашли ошибку в тексте? Выделите ее и нажмите CTRL+ENTER

Рассылки ForkLog: держите руку на пульсе биткоин-индустрии!

We use cookies to improve the quality of our service.

By using this website, you agree to the Privacy policy.

OK