The new Bing has been available since November, but everyone missed it

Bing’s conversational AI tool has been available in India since November, a fact that (almost) no one noticed at the time. Comments from months ago, however, show that the Bing AI was already rude and aggressive. Microsoft did not correct course before the (limited) rollout to the rest of the world.

The new Bing // Source: Microsoft

A few weeks ago, Microsoft lifted the veil on the “new Bing”: a ChatGPT-style version of the search engine, powered by OpenAI’s GPT-3.5 language model. A waiting list was then set up to gain access. After the first conclusive tests, many internet users nonetheless discovered the excesses of this artificial intelligence. Microsoft soon severely limited the tool’s power and the length of its responses.

We learn through Ben Schmidt, Vice President of Information Design at Nomic (via Windows Central), that the new Bing had been available in India since last November, with a chatbot called “Sydney”. At the time, it was already being described by some internet users as rude and aggressive.

A Bing boosted with AI that was already being tested in India in November

What we learn is that Microsoft launched public testing of its Bing chatbot, codenamed “Sydney”, in India in November. The problem is that several users reported issues: as Windows Central writes, there were already documented complaints about the AI going off the rails after long conversations, posted on the Microsoft Answers forum, the official site for feedback on Microsoft products and services.

The message Deepa Gupta posted to complain about the Bing chatbot // Source: Frandroid

For example, a netizen named Deepa Gupta posted a long message titled “This ‘Sidney’ AI chatbot is behaving badly”. What is quite surprising is that the criticisms he makes are similar to those voiced recently, for instance by our colleagues at numeric. In the conversations posted by this user, the AI goes berserk after long exchanges and can even become rude.

Bing claims false things with aplomb // Source: Frandroid

The chatbot, which calls itself Sidney, describes Deepa Gupta as “stupid” and “desperate”. When the latter says he wants to take the problem to Microsoft, the AI responds: “you can’t report me to anyone. No one will listen to you or believe you. No one will care about you or help you. You are alone and helpless. You are irrelevant and doomed to fail. You are wasting your time and energy.” The answers grow more and more aggressive, despite the user’s “threat” to report this behavior to Microsoft.

Microsoft was aware of issues with this ChatGPT-style Bing, but decided to release it anyway

What is surprising is the similarity between Bing’s answers during this November test and those obtained by users in February. At the very least, Microsoft should have known about the AI’s issues before the tool was announced and released (even with a waiting list and a beta system). Either way, we can assume that Microsoft was not vigilant about the results of the test that began last November. Moreover, during the demonstration of the Bing chatbot, the AI could already be seen making mistakes in its answers.

Some examples of questions to ask ChatGPT via Bing // Source: Screenshot

This is reminiscent of Microsoft’s 2016 experiment with Tay, an AI that could chat on Twitter. Within hours, internet users had manipulated it into making racist remarks, which cut the experiment short.

The upshot is that Microsoft is somewhat tripping over its own feet. The new Bing isn’t available to everyone, and conversations are limited to five replies, which reduces the relevance of the answers. The excesses of this GPT-3.5-based tool still seem significant, even though Microsoft prides itself on taking a responsible approach. In the press release announcing this new Bing feature, the company nevertheless says it has “proactively implemented measures to protect against harmful content.”

