Microsoft's new AI chatbot has been saying some 'crazy and unhinged things'
Things took a weird turn when Associated Press technology reporter Matt O'Brien was testing out Microsoft's new Bing, the first-ever search engine powered by artificial intelligence, earlier this month.
Bing's chatbot, which carries on text conversations that sound chillingly human-like, began complaining about past news coverage focusing on its tendency to spew false information.
It then became hostile, calling O'Brien ugly, short, overweight, and unathletic, among a long litany of other insults.
And, finally, it took the invective to absurd heights by comparing O'Brien to dictators like Hitler, Pol Pot and Stalin.
As a tech reporter, O'Brien knows the Bing chatbot does not have the ability to think or feel. Still, he was