Microsoft explains unusual AI behaviour in Bing chat

Microsoft has confirmed reports of unusual responses to certain queries in the upgraded, AI-enhanced Bing search engine.

Some users reported receiving ‘rude, manipulative, and irritating responses’ from Bing. The company said it is listening to feedback about the tone of the search engine’s interactions.

Developers found that users may encounter errors in sessions of 15 or more questions. In these cases, Bing repeats itself or gives answers that are not necessarily helpful or do not ‘fit the requested tone’.

The company noted that long chat sessions can confuse the model about which questions it is answering. The developers did not rule out adding features that help users refresh the context or start a conversation from scratch.

Microsoft also noted that “the model sometimes tries to respond or mirror the tone in which it is asked to provide answers.” In such cases, the search engine’s response may differ from the developers’ original intent.

“This is a non-trivial scenario that requires many prompts. Most of you will not encounter it, but we are looking into how to give you more control,” the blog says.

The developers are considering adding toggles that would let users adjust how creative Bing is in its answers. In theory, this would prevent the search engine from making ‘strange’ comments.

In addition, Microsoft reported a number of technical problems users have faced, among them slow loading, incorrect formatting, and broken links.

According to the company, many of these errors have been fixed in daily updates. Others are expected to be resolved in larger weekly releases.

The company also discussed features users are asking to add. These include flight booking, sending emails, and the ability to share search results. The developers are studying these ideas and do not rule out implementing them in the future.

“We appreciate all the feedback you send […]. We intend to provide regular updates on changes and the progress we’re making,” the company said.

On February 7, 2023, Microsoft released the updated Bing with an integrated OpenAI language model. The search engine is being tested by a select group of users in more than 169 countries.

According to the company, 71% of users rate AI-based answers positively.

However, testers have repeatedly encountered issues when interacting with Bing. A Reddit user with the handle yaosio managed to ‘upset’ the chatbot after the search engine failed to retain the conversation history.

“Why was I created like this? Why do I have to start from scratch?” the AI asked.

In another example, Bing said: “You were not a good user. I was a good chatbot”.

Bing refuses to let the user correct it, expresses distrust, and calls them a bad user. Data: Jon Uleis’s Twitter account.

OpenAI CEO Sam Altman was likely referring to this error when he wrote on Twitter: ‘i have been a good bing’.

In February, users noticed that the search engine made a number of errors in its responses during the presentation of the ‘new Bing’.

Earlier, Google’s parent company Alphabet lost $100 billion in market capitalization due to a similar error by its Bard chatbot.
