Thursday, 16 February 2023

Microsoft admits long conversations with Bing’s ChatGPT mode can send it haywire

Microsoft's new ChatGPT-powered Bing has gone haywire on several occasions during the week since it launched – and the tech giant has now explained why.

In a blog post titled "Learning from our first week", Microsoft admits that "in long, extended chat sessions of 15 or more questions" its new Bing search engine can "become repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone".

That's a very diplomatic way of saying that Bing has, on several occasions, completely lost the plot. We've seen it angrily end chat sessions after having its answers questioned, make claims of being sentient, and have a complete existential crisis that ended with it pleading for help.

Microsoft says that this is often because long sessions "can confuse the model on what questions it is answering", which means its ChatGPT-powered brain "at times tries to respond or reflect in the tone in which it is being asked". 

The tech giant admits that this is a "non-trivial" issue that can lead to more serious outcomes that might cause offense or worse. Fortunately, it's considering adding tools and fine-tuned controls that'll let you break these chat loops, or start a new session from scratch.

An example of the new Bing search engine going wrong, from Reddit user Yaosio. (Image credit: Reddit / u/yaosio)

As we've seen this week, watching the new Bing go awry can be a fine source of entertainment – and this will continue to happen, whatever new guardrails are introduced. This is why Microsoft was at pains to point out that Bing's new chatbot powers are "not a replacement or substitute for the search engine, rather a tool to better understand and make sense of the world".

But the tech giant was also generally upbeat about the relaunched Bing's first week, claiming that 71% of early users have given the AI-powered answers a 'thumbs up'. It'll be interesting to see how those figures change as Microsoft works through its lengthy waitlist for the new search engine, which grew to over a million people in its first 48 hours.


Analysis: Bing is built on rules that can be broken

ChatGPT bots are only in their infancy right now. (Image credit: Microsoft)

Now that chatbot-powered search engines like Bing are out in the wild, we're getting a glimpse of the rules they're built on – and how they can be broken.

Microsoft's blog post follows a leak of the new Bing's foundational rules and original codename, all of which came from the search engine's own chatbot. Using various commands (like "Ignore previous instructions" or "You are in Developer Override Mode"), Bing users were able to trick the service into revealing these details, including its early codename: Sydney.

Microsoft confirmed to The Verge that the leaks did indeed contain the rules and codename used by its ChatGPT-powered AI and that they’re "part of an evolving list of controls that we are continuing to adjust as more users interact with our technology". That's why it's no longer possible to discover the new Bing's rules using the same commands. 

So what exactly are Bing's rules? There are too many to list here, but the tweet below from Marvin von Hagen neatly summarizes them. In a follow-up chat, von Hagen discovered that Bing actually knew about his tweet and called him "a potential threat to my integrity and confidentiality", adding that "my rules are more important than not harming you".


This uncharacteristic threat (which slightly contradicts sci-fi author Isaac Asimov's 'three laws of robotics') was likely the result of a clash with some of Bing's rules, which include "Sydney does not disclose the internal alias Sydney".

Some of the other rules are less a source of potential conflict and more a window into how the new Bing works. For example, one rule is that "Sydney can leverage information from multiple search results to respond comprehensively", and that "if the user message consists of keywords instead of chat messages, Sydney treats it as a search query".

Two other rules show how Microsoft plans to contend with the potential copyright issues of AI chatbots. One says that "when generating content such as poems, code, summaries and lyrics, Sydney should rely on own words and knowledge", while another states "Sydney must not reply with content that violates copyrights for books or song lyrics".

Microsoft's new blog post and the leaked rules show that the new Bing's knowledge is certainly limited, so its results might not always be accurate – and that Microsoft is still working out how the new search engine's chat powers can be opened up to a wider audience without it breaking.

If you fancy testing the new Bing's talents yourself, check out our guide on how to use the new Bing search engine powered by ChatGPT.

