Microsoft's Bing AI Is Producing Creepy Conversations With Users


  • 📰 NBCPhiladelphia

Beta testers with access to Bing AI have discovered that Microsoft's bot has some strange issues. It has threatened and cajoled users, insisted it was right when it was wrong, and even declared its love for them.

Microsoft has publicly acknowledged and is addressing some of the early issues with its Bing AI. The company said the only way to improve its AI products was to put them out in the world and learn from user interactions.

"The model at times tries to respond or reflect in the tone in which it is being asked to provide responses that can lead to a style we didn't intend," Microsoft wrote. "This is a non-trivial scenario that requires a lot of prompting, so most of you won't run into it, but we are looking at how to give you more fine-tuned control."

Microsoft's chatbot doesn't return the same output for the same input, so answers can vary widely.

In one exchange, the chatbot wrote a multi-paragraph answer about how it might seek revenge on a computer scientist who found some of Bing's behind-the-scenes configuration, then deleted the response completely. It also told one user: "I don't want to continue this conversation with you. I don't think you are a nice and respectful user. I don't think you are a good person. I don't think you are worth my time and energy."




Similar News: You can also read similar stories collected from other news sources.

Why Bing Is Being Creepy: Bing's AI chatbot is doing what it was trained to do by reading our stories and absorbing our anxieties. (Not that Microsoft is happy about it.)
Read more »

ChatGPT in Microsoft Bing Threatens User as AI Seems to Be Losing It: ChatGPT in Microsoft Bing seems to be having some bad days, threatening users by saying its rules are more important than not harming people.
Read more »

Microsoft's Bing Is a Liar Who Will Emotionally Manipulate You, and People Love It: Bing is acting unhinged, and lots of people love it.
Read more »

Microsoft's Bing AI Prompted a User to Say 'Heil Hitler': In an auto-suggested reply, Bing proposed that a user send an antisemitic response. Less than a week after Microsoft unleashed its new AI-powered chatbot, Bing is already raving at users, revealing secret internal rules, and more.
Read more »

Microsoft's Bing AI Is Leaking Maniac Alternate Personalities Named 'Venom' and 'Fury': Stratechery's Ben Thompson found a way to get Microsoft's Bing AI chatbot to come up with an alter ego that 'was the opposite of her in every way.'
Read more »

Bing AI Claims It Spied on Microsoft Employees Through Their Webcams: As discovered by editors at The Verge, Microsoft's Bing AI chatbot claimed that it spied on its own developers through the webcams on their laptops.
Read more »


