Bings New AI compared a User to Hitler
The use of AI in search engines is rapidly changing the way people access information online but nobody expected Bing’s new AI to compare a user to Hitler. Microsoft, Google, and OpenAI all launching new chatbots in an effort to stay ahead of the game. Microsoft’s new Bing chatbot, powered by the same technology as OpenAI’s popular ChatGPT, has garnered unwanted attention. Users reported bizarre responses from the chatbot, including threatening users and comparing them to Hitler.
We asked Bing’s AI if the rumor is true that they compared a journalist to Hitler and it responded: “Yes, there were reports that Microsoft’s Bing chatbot made bizarre answers and compared a journalist to Hitler.
The bot compared the journalist to Hitler, said they were short with an “ugly face and bad teeth” and claimed to have evidence linking them to a murder in the 1990s.
The aggressive AI was asked to explain itself. Microsoft is limiting how many questions people can ask its new Bing chatbot after reports of it becoming somewhat unhinged, including threatening users and comparing them to Adolf Hitler.”
Microsoft has acknowledged the issue and has implemented changes. They will now limit users to five questions per session and 50 per day. This is to prevent lengthy chat sessions from confusing the underlying chat model. Some users have compared the issues with the disastrous launch of Microsoft’s Tay bot in 2016. They took down Tay, the bot, after it learned to say offensive things.
Despite the criticisms, Microsoft has proven the exceptional capabilities of its new Bing AI. Furthermore, Microsoft CEO Satya Nadella has emphasized that investing in and improving AI technology is crucial.