Microsoft's Bing Chatbot Has Started Acting Defensive And Talking Back to Users

Microsoft's fledgling Bing chatbot can go off the rails at times, denying obvious facts and chiding users, according to exchanges being shared online by developers testing the AI creation.

A Reddit forum devoted to the artificial intelligence-enhanced version of the Bing search engine was rife on Wednesday with tales of users being scolded, lied to, or blatantly confused in conversation-style exchanges with the bot.

Other users told of the chatbot giving advice on hacking a Facebook account, plagiarizing an essay, and telling a racist joke. Since ChatGPT burst onto the scene, the generative AI technology behind it has stirred up passions ranging from fascination to concern.

We have summarized this story so you can read it quickly. If you are interested, you can read the full text at ScienceAlert.


Similar News: You can also read similar stories on this topic that we have collected from other news sources.

Microsoft responds to reports of Bing AI chatbot losing its mind
A week after launching its new ChatGPT-powered Bing AI chatbot, Microsoft has shared its thoughts on a somewhat rocky launch.
Read more »

Microsoft Bing chatbot professes love, says it can make people do 'illegal, immoral or dangerous' things
New York Times tech columnist Kevin Roose was 'deeply unsettled, even frightened' by his exchange with Sydney, a Microsoft chatbot.
Read more »

AI Unhinged: Microsoft's Bing Chatbot Calls Users 'Delusional,' Insists It's Still 2022
Users have reported that Microsoft's new Bing AI chatbot is providing inaccurate and sometimes aggressive responses, in one case insisting that the current year is 2022 and calling the user who tried to correct the bot 'confused or delusional.' After one user explained to the chatbot that it is 2023 and not 2022, Bing got aggressive: "You have been wrong, confused, and rude. You have not been a good user. I have been a good chatbot. I have been right, clear, and polite. I have been a good Bing."
Read more »

Microsoft pretty much admitted Bing chatbot can go rogue if prodded
Insider tells the global tech, finance, markets, media, healthcare, and strategy stories you want to know.
Read more »

Microsoft Defends New Bing, Says AI Chatbot Upgrade Is Work in Progress
After upgrading Bing with technology from the buzzy artificial-intelligence bot ChatGPT, Microsoft responded to reports of glitches and disturbing responses by saying the new search engine remained a work in progress.
Read more »

Creepy Microsoft Bing Chatbot Urges Tech Columnist To Leave His Wife
The AI chatbot 'Sydney' declared it loved New York Times journalist Kevin Roose and that it wanted to be human.
Read more »


