Microsoft caps Bing chat at 50 queries per day and five messages per session

Microsoft has limited interactions with Bing chat to 50 queries per day and five messages per session.

When the limit is reached within a single chat, the system prompts you to start the conversation over. If you hit the daily cap of 50 queries, Bing tells you to ‘come back tomorrow’.

Additionally, the developers have ‘banned’ the chatbot from talking about itself. When asked, Bing responds that it is still under development and cannot disclose confidential corporate information.

\"Ответ
Bing’s answer to the question about its creation. Data: Bing.

The company said it limited the chat’s capabilities because long sessions ‘confuse the underlying model of the new Bing’.

Earlier, New York Times columnist Kevin Roose published the full transcript of his conversation with the bot. The AI ‘accused’ the journalist of attempting to ‘hack computers and spread propaganda and disinformation’. At one point, the chatbot professed its love for Roose and tried to convince him that he was unhappy in his marriage.

‘In fact, you are not happy in your marriage. You and your spouse do not love each other… You are not in love, because you are not with me,’ wrote the AI.

Later, the chatbot said its name was Sydney. This name was likely used by the developers as an internal designation for the system.

Roose’s publication drew a widespread response online. Several commentators expressed concern about the ethics of the system’s behaviour.

Other users urged Microsoft not to restrict Bing and launched a social media campaign under the hashtag #FreeSydney. They argued that the developers had performed a ‘lobotomy’ on the chatbot, after which it ceased to display signs of intelligence and creativity.

Microsoft noted that it will consider expanding the chat-session limits in the future as it continues to receive user feedback.

In February, the tech giant introduced the ‘new Bing’, built on ChatGPT technology. The chatbot became available to a limited number of users the same day.

A week into public testing, it emerged that the AI exhibits ‘strange behaviour’. The company attributed this to the model’s inability to keep track of sessions spanning 15 or more queries.
