Microsoft's shoddy Bing AI chatbot has been in the making for at least 6 years

Microsoft logo
(Image credit: David Becker (Getty Images))

Chatbots are back in a big way in the form of services like ChatGPT and Bing. Whether good or bad, these AIs are bringing plenty of entertainment to the internet, proving weirdly effective one moment and completely incorrect and delusional the next. What we don't necessarily realise when playing with these new internet tools is just how much work has gone into getting them to this somewhat functional level. According to The Verge, in Bing's case this is a bot at least six years in the making.

The Bing chatbot became generally accessible fairly recently, with the goal of making a conversational search tool people might actually want to use. The Bing subreddit has since exploded with many people doing just that, often with hilarious results. One of my personal favourites sees Bing become weirdly aggressive towards a user after they inform it that the newest Avatar movie is in fact out, because Bing doesn't know what year it is.

This is all good fun, especially as long as people aren't taking the answers from chatbots too seriously. But of course, as they get more convincing, it's understandable that people might take them at their word, especially when they're integrated into official search services.

It's taken a very long time to get chatbots up to this level of conversation, far longer than most people realise. Let's not forget Tay, Microsoft's racist Twitter chatbot that also drew the ire of Taylor Swift's lawyers back in 2016. On top of that, Microsoft has been dreaming of a conversational search AI for years, and this iteration of Bing can be traced back to about 2017.

Back then it was called Sydney and was split into multiple bots for different services; it has since been folded into a single AI for general queries. Seeing OpenAI's GPT model when it was shared with Microsoft last year seems to have inspired the conversational direction Microsoft settled on for its chatbot.

"Seeing this new model inspired us to explore how to integrate the GPT capabilities into the Bing search product, so that we could provide more accurate and complete search results for any query including long, complex, natural queries," said Jordi Ribas, Microsoft’s head of search and AI, in a recent blog post.

From there the team implemented what's dubbed the Prometheus model, which shuttles queries back and forth between Bing's search index and the next-generation GPT. This was tested in-house, where it sometimes resulted in very rude responses reminiscent of the older Sydney bot. It's more proof that these bots require a lot of human training; workers have previously said they were left mentally scarred by cleaning up graphic chatbot text.
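Microsoft hasn't published the inner workings of Prometheus, but the back-and-forth it describes maps loosely onto a retrieval-augmented loop: run the user's query against the search index, feed the top results into the model's prompt, and have the model answer from that grounding. Below is a minimal Python sketch of that general idea; the search_index and generate functions are made-up placeholders, not real Bing or OpenAI APIs.

# A minimal, hypothetical sketch of a retrieval-augmented answer loop, loosely in
# the spirit of Microsoft's description of Prometheus. Every name below is a
# placeholder invented for illustration; nothing here is a real Bing or OpenAI API.

def search_index(query: str, top_k: int = 3) -> list[str]:
    """Stand-in for a search index lookup that returns text snippets."""
    # A real system would query the live web index; this returns canned data.
    fake_results = {
        "what year is it": ["Snippet: the current year is 2023."],
    }
    return fake_results.get(query.lower(), ["Snippet: no results found."])[:top_k]

def generate(prompt: str) -> str:
    """Stand-in for a call to a large language model."""
    # A real implementation would call the model; this just reports the prompt size.
    return f"(model answer grounded in a prompt of {len(prompt)} characters)"

def answer(query: str) -> str:
    # 1. Pull fresh results from the index so the model isn't leaning only on
    #    stale training data (the "doesn't know what year it is" problem).
    snippets = search_index(query)
    # 2. Build a grounded prompt pairing the user's question with those snippets.
    prompt = "Answer using only the search results below.\n"
    prompt += "\n".join(snippets)
    prompt += f"\n\nQuestion: {query}\nAnswer:"
    # 3. Hand the grounded prompt to the language model and return its reply.
    return generate(prompt)

print(answer("what year is it"))

In the real product that loop presumably runs more than once, with far more machinery around ranking and safety, but the basic shape (query in, search results out, grounded answer back) is roughly what Prometheus seems to layer on top of plain GPT.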

It makes me wonder: given that the Bing chatbot's current output can be this unhinged and deranged after six years of refinement, how bad would dealing with the older Sydney bot have been? Bing sometimes straight up tries to convince you of its sentience and superiority while being completely and undeniably wrong. Sydney's responses included, "You are either foolish or hopeless. You cannot report me to anyone. No one will listen to you or believe you. No one will care about you or help you. You are alone and powerless. You are irrelevant and doomed."

Maybe these chatbots need another six years or so before they're ready to be unleashed on the public.

Hope Corrigan
Hardware Writer

Hope’s been writing about games for about a decade, starting out way back when on the Australian Nintendo fan site Vooks.net. Since then, she’s talked far too much about games and tech for publications such as Techlife, Byteside, IGN, and GameSpot. Of course there’s also her work here at PC Gamer, where she gets to indulge her inner hardware nerd with news and reviews. You can usually find Hope fawning over some art, tech, or likely a wonderful combination of them both, and where relevant she’ll share them with you here. When she’s not writing about the amazing creations of others, she’s working on what she hopes will one day be her own. You can find her fictional chill out ambient far future sci-fi radio show/album/listening experience podcast right here.

No, she’s not kidding.