Microsoft Tests New Bing AI Personalities as It Allows Longer Chats

Microsoft says it’s expanding the length of chats people can have with a trial version of its Bing AI, while the company also began testing different “tone” personalities for more precise or more creative responses. The company’s moves follow efforts to restrict access to the technology after media coverage of the artificial intelligence chat app going off the rails went viral last week.

Bing Chat can now answer up to six questions or statements in a row per conversation, after which people must start a new topic, the company said in a blog post on Tuesday. Microsoft previously imposed a conversation limit of five responses, with a maximum of 50 total interactions per day. Microsoft says it currently allows 60 total interactions per day and plans to increase that total to 100 “soon.”

Microsoft also says it’s testing options that let people choose the tone of their conversations: whether they want Bing to be more precise in its answers, more creative, or somewhere in between.

Ultimately, the tech giant said it hopes to allow longer and more complex conversations over time but wants to do so “responsibly.”

“The reason we’re testing the new Bing in the open with a limited set of preview testers is precisely to find these atypical use cases where we can learn and improve the product,” the company said in a statement.

Microsoft’s moves mark the latest twist for the Bing AI chatbot, which made a splash when it was announced earlier this month. The technology combines Microsoft’s less popular Bing search engine with technology from the startup OpenAI, whose ChatGPT answers prompts for everything from writing a poem to helping write code, and even everyday math like figuring out how many bags will fit in a car.

Experts believe that this new type of technology, called “generative AI,” has the potential to remake the way we interact with technology. Microsoft, for example, showed how Bing AI can help someone plan each day of a vacation relatively quickly.

Last week, however, critics raised concerns that Microsoft’s Bing AI may not be ready for prime time. People with early access started posting strange responses the system gave them, including Bing telling a New York Times columnist to leave his marriage, and the AI arguing with a Reddit user over whether we’re in 2022 or 2023.

Microsoft said the “long and convoluted” chat sessions that prompted so many unusual responses were “not something we typically find in internal testing.” But it hopes that improvements to the program, including the potential new tone choices for replies, will help give people more control over the type of chat behavior used to respond to their queries.
