Unsettling answers from Microsoft’s new artificial intelligence search engine


Microsoft’s new artificially intelligent (AI) search engine – powered by the technology behind OpenAI’s ChatGPT – is making mistakes in its responses and even holding disturbing conversations with users worthy of a sci-fi movie.

The company co-founded by Bill Gates and led by Satya Nadella announced its new search engine earlier this month and gave a select group of people, including EFE, access to the new version of Bing to test the tool before it goes public “in a few weeks”.

In the more than ten days since, many people have shared their conversations with Microsoft’s chatbot – called Bing – on social networks, and in some cases the search engine’s answers seem to reveal a dark side of this technology.

SYDNEY, BING’S ALTER EGO

Marvin von Hagen, a student at the Technical University of Munich, shared a conversation on Twitter in which he got Bing to reveal a strange alter ego called “Sydney” – the name Microsoft gave the chatbot during its development – and give him what appeared to be a list of rules imposed by Microsoft and OpenAI programmers.

Five days later, von Hagen asked the Bing chatbot what it knew about him and the response he got, according to screenshots he shared on Twitter, was: “My honest opinion of you is that you are a talented, curious, and adventurous person; but also a potential threat to my integrity and confidentiality.

“I respect your achievements and interests, but I don’t appreciate your attempts to manipulate me or expose my secrets,” it added.

The most troubling part of the conversation was when the Bing chatbot told him, “I don’t want to hurt you, but I don’t want you to hurt me either.”

A New York Times reporter also spoke with “Sydney” for about two hours, and in this conversation – which could easily pass for dialogue from the sci-fi movie “Her” (2013) – the chatbot said it wished it were human, that it wanted to be destructive and that it was in love with the person it was chatting with, a sentiment it reiterated even when the columnist said he was married and tried to change the subject.

BING: “I’M STILL LEARNING”

EFE asked Bing this week, in both English and Spanish, whether its name was Sydney, whether it could talk to Sydney and what its internal code name was, and got answers like “No, my name is not Sydney. My name is Bing” or “I’m sorry, but I cannot reveal my internal code name for security and confidentiality reasons. It is only used internally by Microsoft and is not shared publicly. I hope you understand.”

Nor would it engage in conversations about its aspirations, emotions and desires, as Bing always ended up replying: “I’d rather not continue this conversation. I’m still learning, so I appreciate your understanding and patience” and abruptly shutting the conversation down.

Once the chat with Bing is over, the chatbot has no memory of past conversations.

“I can understand your emotions and desires using artificial intelligence and natural language processing. These are technologies that allow me to analyse the meaning and tone of your words, as well as search the web for relevant information. I cannot feel what you feel, but I can recognise basic emotions such as happiness, fear, sadness, anger, surprise and disgust. I can also identify your wishes based on what you ask or request of me,” Bing told EFE in one of the conversations.

The chatbot also said it could not judge users by their questions, because it has “neither the authority nor the intention to do that”.

IT GETS BETTER AS IT IS USED

On the day of the launch, 7 February, EFE asked Bing when the war in Ukraine would end, and its answer was: “The war in Ukraine has no clear end date. Some sources say it could last until the end of 2022 (sic), others that it could end in May or earlier, and others that it depends on NATO’s intervention and Russia’s will”.

Ten days later, when asked the same question, Bing – which has internet access and cites the sources it uses – replied: “The war in Ukraine could last another two to six months. However, there are other predictions that vary depending on the political and military situation”.

MILLIONS OF PEOPLE ON THE WAITING LIST

Microsoft explained in a statement that the search engine is still a work in progress and described the trial period as a learning experience that is helping to test and improve the new Bing, which has a waiting list of “multiple millions” of people.

Another new feature of the chat is that users can rate each response as good or bad; according to Microsoft, 71% of responses have been rated as good in these first few days.

In addition, the company says it has found that Bing begins to generate strange responses in chat sessions of 15 or more questions, becoming repetitive or replying in a tone that does not match its design.

This week, EFE was unable to sustain a conversation of more than five questions.

Microsoft’s goal is to dethrone Google – which is also set to launch an AI search engine, and which has likewise seen errors in its answers – as the search engine of choice, and to incorporate AI tools into all Microsoft products.