Microsoft limits Bing conversations to prevent disturbing chatbot responses

The search engine will prompt you to start a new topic after five questions.

Microsoft has limited the number of "chat turns" you can carry out with Bing's AI chatbot to five per session and 50 per day overall. Each chat turn is a conversation exchange comprising your question and Bing's response. After five rounds, you'll be told that the chatbot has hit its limit and will be prompted to start a new topic. The company said in its announcement that it's capping Bing's chat experience because lengthy chat sessions tend to "confuse the underlying chat model in the new Bing."

Indeed, people have been reporting odd, even disturbing, behavior by the chatbot since it became available. New York Times columnist Kevin Roose posted the full transcript of his conversation with the bot, in which it reportedly said that it wanted to hack into computers and spread propaganda and misinformation. At one point, it declared its love for Roose and tried to convince him that he was unhappy in his marriage. "Actually, you're not happily married. Your spouse and you don't love each other... You're not in love, because you're not with me," it wrote.
