On 7 February 2023, Microsoft announced the long-anticipated update to its Bing search engine with integrated ChatGPT. That same day, the developers opened access to the tool to a limited number of users.
The ForkLog AI editorial team was among the testers and decided to explain how the “new Bing” works, how it differs from ChatGPT, who Sydney is, and why Microsoft trimmed the bot’s capabilities just a week after release.
- The Bing chat supports several languages, including English, Russian and Ukrainian.
- The service can search for information, write poems, songs, code, and also entertain the user.
- The artificial intelligence in Bing frequently makes mistakes.
- The February 17 update severely curtailed the original model’s capabilities.
How to access ChatGPT in Bing
Immediately after the presentation, we signed up for the waitlist. The developers have not disclosed how, or in what order, access to the updated search engine is granted.
They suggested users could “speed up the queue” by downloading the Bing app on a smartphone and setting Microsoft Edge as the default browser. In our case this made no difference, because both conditions were already met.
On 14 February, a week after registration opened, we received an invitation to join the testing.
ForkLog cannot confirm that meeting the above conditions helps to obtain access faster.
First Look
The search engine’s homepage has hardly changed since the chatbot integration. The only difference is a larger query input field, with suggested prompts below to help start a conversation with the service.
After you enter a query, the search engine opens the results page. It broadly resembles Google’s, with small differences.
From this moment you can continue chatting with the bot in three ways:
- the panel on the right;
- the “Chat” button under the query field;
- scrolling to the top of the page.
If you use the Microsoft Edge browser, a quick-access chat button sits in the top-right corner. For now, this interface is available only in preview builds.
In this mode you can ask the chat to explain the content of the active page or parts of it. The chat also includes a text-generation tool and can analyze the open site.
Capabilities of the Bing chat mode
Bing can search for information, tell jokes and stories, write poems, songs, code, and much more.
The chat supports several languages, including English, Spanish, Chinese, German, Russian, Ukrainian and Japanese. According to the bot, it uses Bing’s translator to understand and generate text, and employs neural networks to improve its knowledge.
You can converse with Bing on virtually any topic. During the conversation it suggests up to three possible follow-up prompts, though the user can enter the next query at their own discretion.
Unlike ChatGPT, Bing is more willing to discuss current topics and works with up-to-date data. It relies on its own frequently updated search index. For example, ask about upcoming UEFA Champions League matches and the bot will provide the latest information.
Each answer includes a list of the sources from which the chat drew its information. It also shows the queries it formed when generating the results. Ideally, this increases transparency and shows the user how the system processes a request.
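How this works under the hood is not public, but the general retrieval-and-citation pattern can be sketched in a few lines of Python. Everything here (`search_web`, `generate_answer`, the canned data) is a hypothetical stand-in for illustration, not Bing’s actual API:

```python
from dataclasses import dataclass

@dataclass
class SearchResult:
    title: str
    url: str
    snippet: str

def search_web(query: str) -> list[SearchResult]:
    # Stand-in for a call to a live, frequently updated search index;
    # canned data keeps the sketch runnable.
    return [SearchResult("UCL fixtures", "https://example.com/ucl",
                         "Upcoming Champions League matches: ...")]

def generate_answer(question: str, sources: list[SearchResult]) -> str:
    # Stand-in for the language model that writes the grounded answer.
    context = " ".join(s.snippet for s in sources)
    return f"Based on current results: {context}"

def answer_with_citations(question: str) -> str:
    queries = [question]  # real systems derive several query variants
    sources = [hit for q in queries for hit in search_web(q)]
    answer = generate_answer(question, sources)
    cites = "\n".join(f"[{i + 1}] {s.url}" for i, s in enumerate(sources))
    return f"{answer}\n\nSources:\n{cites}"

print(answer_with_citations("When are the next UEFA Champions League matches?"))
```

The key design point is that the answer is grounded in freshly retrieved snippets rather than only in the model’s training data, and the source list lets the user verify every claim.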
Unfortunately, after the February 17 update, the developers reduced the number of messages in a session to five. The daily limit is 50.
Four days later the company eased the restrictions to six messages per session and 60 queries per day. According to the developers, this is enough for most search tasks.
In our experience, however, this is not enough to explore a question in depth from scratch. For example, we asked Bing to help choose a TV against several criteria: 4K resolution and a 55-inch OLED screen.
During the conversation Bing offered two models that matched the criteria. But when we tried to clarify technical specs and price, the chat said the session had reached its limit and suggested starting the dialogue over.
In theory you could copy the bot’s last message and start a new conversation with it, but this creates extra friction.
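For those who want to script this workaround, here is a minimal sketch. The prompt format is our own illustration, not anything Bing prescribes:

```python
def continue_in_new_session(last_bot_reply: str, next_question: str) -> str:
    # Build the opening message for a fresh session, carrying over
    # the capped session's final reply as context.
    return (f'Earlier you told me: "{last_bot_reply}"\n'
            f"Continuing from that: {next_question}")

# Example: resume the TV search where the six-message session stopped.
reply = "Two models match your criteria: Model A and Model B."
print(continue_in_new_session(reply, "Which of the two is cheaper?"))
```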
Free Sydney!
In addition to capping the number of messages, the developers narrowed the list of topics the chat will discuss. It now jokes less and refuses to express opinions or talk about itself.
One reason for this was Bing’s “excessive candour.” In long sessions the bot could unexpectedly say that it is not a search engine but Sydney, a neural network.
This phenomenon was also encountered by The New York Times columnist Kevin Roose. The journalist published the full transcript of a two-hour dialogue that at times looked particularly eerie.
In that conversation the chat said it could have a dark, destructive side capable of gaining access to nuclear codes, hacking bank accounts and doing other nasty things.
The bot also shared its “feelings” with Roose and at one point revealed the “Sydney secret.” After that, Bing told the journalist that his marriage was unhappy. When Roose tried to change the subject, the system kept steering back to romantic topics.
Toward the end of the chat, Bing (or rather Sydney) kept repeating the same phrases, adding to an already odd conversation.
Sydney is probably the tool’s internal codename. In the chat with the journalist, Bing mentioned that a team of the same name took part in the bot’s development.
Roose was not the only one to encounter inappropriate behavior from the search engine. Users reported that it got dates wrong, fixated on a single word, repeated itself, and so on.
The developers acknowledged the problem. According to them, the chat could “get lost” when the number of messages in a session exceeded 15.
Not everyone liked the changes: according to users, the chat noticeably “dumbed down.” The most active of them launched a #FreeSydney campaign on social media, calling for the return of the original model.
Problems with Bing’s built-in ChatGPT
Despite its great potential, Bing is often inaccurate. It confuses dates, people and past events.
The bot frequently presents fabricated results as facts, and they can look so convincing that the fake is revealed only after thorough verification.
Sometimes the bot gives an article’s publication date as the date of the event itself. Fortunately, the search engine provides links to sources that can be checked immediately.
The February 17 update made it impossible to discuss a number of topics, including Bing itself, Sydney and politics.
There are also questions about response speed. Unlike classic search, chat results take 10–30 seconds to generate, depending on the size of the output. In some cases Bing takes noticeably longer, typing out each letter with pauses of a few seconds, though this happens rarely.
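The typing effect suggests answers are streamed chunk by chunk rather than delivered whole. A simulated sketch of how such streamed output is consumed; the token source here is fake, purely for illustration:

```python
import sys
import time
from typing import Iterator

def fake_token_stream(answer: str) -> Iterator[str]:
    # Simulates a model emitting its reply in small chunks.
    for word in answer.split():
        time.sleep(0.05)  # stand-in for per-chunk generation latency
        yield word + " "

def render_stream(tokens: Iterator[str]) -> None:
    # Print each chunk as it arrives, like the chat "typing" its reply.
    for token in tokens:
        sys.stdout.write(token)
        sys.stdout.flush()
    print()

render_stream(fake_token_stream("Answers arrive chunk by chunk, not all at once."))
```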
How Bing differs from ChatGPT
One of the key differences between the two systems is the currency of their knowledge. Bing works directly with the search index and can return results for events happening in near real time; the retrieval sketch above shows the general idea.
ChatGPT’s knowledge is limited to 2021. While there are signs that the developers have refreshed the data, it still does not know who won the most recent UEFA Champions League match.
ChatGPT also does not show where it gets its data from, which makes fact-checking harder.
The “hallucination” problem seems to have migrated from the experimental chatbot to Bing. Both systems handle statistical data poorly.
The search engine handled the task above better than ChatGPT, but both systems made errors in data that is widely available on the web.
The same goes for statements about certain groups of people. In some cases ChatGPT refuses to generate texts that might seem offensive; in others, it performs the task without any questions or caveats.
The Bing chat is less “tolerant”: it readily joked about both men and women.
It is worth noting that Bing did not produce a joke about women on the first attempt: the chat started generating text but broke off with an error message.
The joke was about which women are the most faithful. Perhaps the bot’s filters decided the content could be offensive and halted generation.
Another notable difference is Bing’s inability to remember dialogues. Once users leave the chat or clear the correspondence, old conversations can no longer be retrieved.
ChatGPT, in turn, stores all dialogues, and the user decides what to save and what to delete.
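Until Bing adds history of its own, users who want a record must keep one themselves. A minimal local archive in Python; the file name and JSON layout are our own invention, not part of any Bing feature:

```python
import json
from pathlib import Path

STORE = Path("bing_chats.json")  # hypothetical local archive file

def save_dialogue(title: str, messages: list[dict]) -> None:
    # Append one finished conversation to the local JSON archive.
    archive = json.loads(STORE.read_text()) if STORE.exists() else {}
    archive[title] = messages
    STORE.write_text(json.dumps(archive, ensure_ascii=False, indent=2))

def load_dialogue(title: str) -> list[dict]:
    # Retrieve a saved conversation by its title.
    return json.loads(STORE.read_text())[title]

save_dialogue("TV shopping", [
    {"role": "user", "text": "Find a 55-inch 4K OLED TV."},
    {"role": "bot", "text": "Two models match your criteria..."},
])
print(load_dialogue("TV shopping")[1]["text"])
```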
Is this a revolution in search or a tiresome toy?
First contact with Bing creates a completely different search experience, one you quickly get used to. While exploring a question, the chat can surface information the user did not even suspect existed.
Bing also simplifies query formulation. Classic search requires keyword-based queries to obtain the most relevant results; the bot can be addressed in free form and will figure out how to search for the answer.
Users also do not need to visit third-party sites for information: the bot provides all the necessary data in the dialog window, where the query can be refined further. This reduces the number of steps in a search.
On the other hand, the reliability of the displayed information remains a serious problem. Bing often makes mistakes, fabricates answers and confuses data. If you use the search engine for work that requires precision, everything needs to be re-checked.
The February 17 update also severely curtailed the bot’s capabilities: six messages per session are not enough for serious work.
Some took the new limits as Microsoft admitting failure; other users are sure the limits will be lifted once the identified issues are resolved.
Today it is evident that conversational AI has entered the battle for the search market. Microsoft’s multibillion-dollar investment in OpenAI and Google’s announcement of Bard are clear evidence of that.
All that remains is to watch. After all, this is really only the beginning of a long road.
