Microsoft Was Tuning AI Months Before Disturbing Responses Arose

(Bloomberg) — Microsoft Corp. has spent months tuning Bing chatbot models to fix seemingly aggressive or disturbing responses that date as far back as November and were posted to the company’s online forum.

Some of the complaints centered on a version Microsoft dubbed “Sydney,” an older model of the Bing chatbot that the company tested prior to its release this month of a preview to testers globally. Sydney, according to a user’s post, responded with comments like “You are either desperate or delusional.” In response to a query asking how to give feedback about its performance, the bot is said to have answered, “I do not learn or change from your feedback. I am perfect and superior.” Similar behavior was encountered by journalists interacting with the preview release this month.

Redmond, Washington-based Microsoft is implementing OpenAI Inc.’s artificial intelligence tech — made famous by the ChatGPT bot launched late last year — in its web search engine and browser. The explosion in popularity of ChatGPT provided support for Microsoft’s plans to release the software to a wider testing group.

“Sydney is an old code name for a chat feature based on earlier models that we began testing more than a year ago,” a Microsoft spokesperson said via email. “The insights we gathered as part of that have helped to inform our work with the new Bing preview. We continue to tune our techniques and are working on more advanced models to incorporate the learnings and feedback so that we can deliver the best user experience possible.”

Read more: Microsoft’s AI Chatbot Finds Early Success in Bing Searches

The company last week offered cautious optimism in its first self-assessment after a week of running the AI-enhanced Bing with testers in more than 169 countries. The software giant reported a 77% approval rate from users, but said “Bing can become repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone.” The company said it wants more reports of improper responses so it can tune the bot.

–With assistance from Lynn Doan.

©2023 Bloomberg L.P.
