News

In a blog post (via The Verge), Microsoft says the tweaks should "help focus the chat sessions": the AI side of Bing will be limited to 50 chat 'turns' (a question and answer) per day, and ...
Bing to limit AI chat to five replies to prevent long, weird conversations. In an effort to keep Bing AI from getting weird, Microsoft is looking to cap its conversations at five replies.
Microsoft is limiting how extensively people can converse with its Bing AI chatbot, following media coverage of the bot going off the rails during long exchanges. Bing Chat will now reply to up to ...
Bing's chat AI bot wants to be your new phone assistant. Here's how it's doing it. Microsoft is bringing its AI chatbot to mobile devices, as well as Skype.
Microsoft will start testing Bing Chat tones, letting users switch between answers that are more creative or more focused on their queries.
Microsoft limits Bing chat to five replies to stop the AI from getting real weird. Microsoft’s new limits mean Bing chatbot users can only ask a maximum of five questions per session and 50 in ...
Microsoft has recently acknowledged that long chat sessions, with more than 15 questions, could cause Bing to become repetitive or give unhelpful responses that don’t match the intended tone.
Users on the subreddit r/bing have shared examples of the Bing Chatbot’s responses to queries that they are calling “unhinged” and “gaslighting,” including scenarios where the bot ...
A post on Microsoft's website has surfaced, suggesting the company knew about Bing Chat's unhinged responses months before launch.
Microsoft's Bing artificial intelligence (AI) chatbot is now limited to only five replies per session, the renowned tech firm announced. The announcement comes shortly after reports emerged exposing that the ...
Research shared exclusively with WIRED shows that Copilot, Microsoft’s AI chatbot, often responds to questions about elections with lies and conspiracy theories.
Over the past few days, early testers of the new Bing AI-powered chat assistant have discovered ways to push the bot to its limits with adversarial prompts, often resulting in Bing Chat appearing ...