Bing might be allowed to go nuts again with Microsoft’s latest announcement. Microsoft launched the new version of Bing in a preview that it has been expanding to more users over the last month. While some of the uncovered issues include Bing giving factually wrong answers, others have involved the AI chatbot becoming unhinged over the course of long conversations. In response, Microsoft capped Bing chats at five turns per session and 50 chats per day last week, saying that chats with over 15 responses would cause Bing to “become repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone.”
Less than a week after implementing those new rules, Microsoft is now starting to give Bing a little bit longer leash. In a blog post on its website, the company announced that it is now increasing the chat turns per session to six and allowing users to have a total of 60 chats per day:

“We intend to bring back longer chats and are working hard as we speak on the best way to do this responsibly. The first step we are taking is we have increased the chat turns per session to 6 and expanded to 60 total chats per day. Our data shows that for the vast majority of you this will enable your natural daily use of Bing. That said, our intention is to go further, and we plan to increase the daily cap to 100 total chats soon.”
That seems like enough chat for most use cases without giving Bing the chance to have an existential crisis. That’s good, especially since it sounds like more changes are on the way, including an option that lets testers pick a preferred tone for Bing’s responses.
Similar News: You can also read similar news to this one that we have collected from other news sources.
Microsoft imposes limits on Bing chatbot after multiple incidents of inappropriate behavior: Microsoft has experienced problems with its Bing search engine, such as inappropriate responses, prompting the company to limit the number of questions users can ask.
Microsoft is already undoing some of the limits it placed on Bing AI: Bing AI testers can also pick a preferred tone, Precise or Creative.
Microsoft Seems To Have Quietly Tested Bing AI in India Months Ago, Ran Into Serious Problems: New evidence suggests that Microsoft may have beta-tested its Bing AI in India months before launching it in the West, and that it was unhinged then, too.
Microsoft Softens Limits on Bing After User Requests: The initial caps unveiled last week came after testers discovered the search engine, which uses the technology behind the chatbot ChatGPT, sometimes generated glaring mistakes and disturbing responses.
Microsoft likely knew how unhinged Bing Chat was for months | Digital Trends: A post on Microsoft's website has surfaced, suggesting the company knew about Bing Chat's unhinged responses months before launch.
Microsoft trying to decrease Bing AI's unsettling answers: Microsoft knows Bing's AI chatbot gives out some eyebrow-raising answers. An exec said decreasing them is a priority.