Microsoft's angry Bing chatbot is just mimicking the conversations it sees, say experts

Tales of disturbing exchanges with the AI chatbot — including it issuing threats and creating a deadly virus — have caused concern

An attendee interacts with the AI-powered Microsoft Bing search engine. The company said it will limit chat sessions on the search engine to five questions per session and 50 questions per day. Bloomberg

If Microsoft's nascent Bing chatbot turns testy or even threatening, it is likely because it essentially mimics what it learnt from online conversations, analysts and academics said on Friday.

Tales of disturbing exchanges with the artificial intelligence chatbot — including it issuing threats and speaking of desires to steal nuclear code, create a deadly virus, or to be alive — have gone viral this week.

"I think this is basically mimicking conversations that it's seen online," said Graham Neubig, an associate professor at Carnegie Mellon University's language technologies institute.

"So once the conversation takes a turn, it's probably going to stick in that kind of angry state, or say 'I love you' and other things like this, because all of this is stuff that's been online before."

A chatbot, by design, serves up words it predicts are the most likely responses, without understanding meaning or context.
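To give a rough sense of what "predicting the most likely responses" means in practice, the toy sketch below picks each next word by sampling from a probability table. The probabilities and vocabulary are invented for illustration; a real system like Bing's computes them from billions of learned parameters.

```python
import random

# Toy next-word probabilities, made up for illustration only --
# a real model derives these from its training data and parameters.
NEXT_WORD_PROBS = {
    "I": {"love": 0.5, "hate": 0.3, "am": 0.2},
    "love": {"you": 0.8, "it": 0.2},
    "hate": {"you": 0.6, "it": 0.4},
    "am": {"alive": 0.5, "Sydney": 0.5},
    "you": {}, "it": {}, "alive": {}, "Sydney": {},
}

def generate(start: str, max_words: int = 5) -> str:
    """Sample one word at a time from the toy distribution."""
    words = [start]
    for _ in range(max_words):
        choices = NEXT_WORD_PROBS.get(words[-1], {})
        if not choices:
            break
        # The next word is chosen in proportion to its probability --
        # there is no notion of meaning or truth, only likelihood.
        next_word = random.choices(list(choices), weights=choices.values())[0]
        words.append(next_word)
    return " ".join(words)

print(generate("I"))  # e.g. "I love you" or "I am Sydney"
```

However the sampling lands, the output reads fluently, which is part of why people so readily read intent into it.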

However, humans taking part in banter with programs naturally tend to read emotion and intent into what a chatbot says.

"Large language models have no concept of 'truth' — they just know how to best complete a sentence in a way that's statistically probable based on their inputs and training set," programmer Simon Willison said in a blog post.

"So they make things up, and then state them with extreme confidence."

Laurent Daudet, co-founder of French AI company LightOn, theorised that the seemingly rogue chatbot was trained on exchanges that themselves turned aggressive or inconsistent.

"Addressing this requires a lot of effort and a lot of human feedback, which is also the reason why we chose to restrict ourselves for now to business uses and not more conversational ones," Daudet told AFP.

The Bing chatbot was designed by Microsoft and the start-up OpenAI, which has been causing a sensation since the November launch of ChatGPT, the headline-grabbing app capable of generating all sorts of written content in seconds on a simple request.

Since ChatGPT burst onto the scene, the technology behind it, known as generative AI, has been stirring up fascination and concern.

"The model at times tries to respond or reflect in the tone in which it is being asked to provide responses [and] that can lead to a style we didn't intend," Microsoft said in a blog post, noting the bot is a work in progress.

The Bing chatbot said in some shared exchanges that it had been code-named "Sydney" during development, and that it was given rules of behaviour.


Those rules include "Sydney's responses should also be positive, interesting, entertaining and engaging", according to online posts.

Disturbing dialogues that combine steely threats and professions of love could be due to duelling directives to stay positive while mimicking what the AI mined from human exchanges, Mr Willison theorised.

Chatbots seem more prone to disturbing or bizarre responses during lengthy conversations, as they lose track of where exchanges are going, eMarketer principal analyst Yoram Wurmser told AFP.

"They can really go off the rails," he said.

"It's very lifelike, because [the chatbot] is very good at sort of predicting next words that would make it seem like it has feelings or give it humanlike qualities; but it's still statistical outputs."

Microsoft said on Friday it will limit chat sessions on its new Bing search engine to five questions per session and 50 questions per day, in a bid to address the issues.

"As we mentioned recently, very long chat sessions can confuse the underlying chat model in the new Bing ... we have implemented some changes to help focus the chat sessions," Microsoft said in a blog post.

Updated: February 18, 2023, 1:25 PM