Microsoft said it’s expanding the lengths of chats people can have with its Bing AI chatbot, and the company has also begun testing different “tone” personalities for more precise or more creative responses. The moves follow Microsoft’s efforts to restrict access to the technology after media coverage of the AI chat app going off the rails.
Bing Chat can now respond to up to six questions or statements in a row per conversation, after which people will need to start a new topic, the company said in a blog post Tuesday. Microsoft had previously imposed a conversation limit of five responses, with a maximum of 50 total interactions per day. Microsoft said it will now allow 60 total interactions per day and plans to increase that total to 100 “soon.”
Microsoft also said it’s testing options for people to choose the tone of their conversations, whether they prefer Bing to be more precise in its responses, more creative or somewhere between the two.
Ultimately, the tech giant said it hopes to allow longer and more intricate conversations over time but wants to do so “responsibly.”
“The very reason we are testing the new Bing in the open with a limited set of preview testers is precisely to find these atypical use cases from which we can learn and improve the product,” the company said in a statement.
Microsoft’s moves mark the latest twist for its Bing AI chatbot, which made a splash when it debuted. The technology combines Microsoft’s less-popular Bing search engine with technology from OpenAI, whose ChatGPT responds to prompts for everything from writing a poem to helping write code, and even everyday math such as figuring out how many bags can fit in a car.
Experts believe this new type of technology, called “generative AI,” has the potential to remake online search. Microsoft, for example, demonstrated how its Bing AI could help someone plan a vacation day by day with relative ease.
Last week, though, critics raised concerns that Microsoft’s Bing AI may not be ready for prime time. People with early access began posting bizarre responses the system was giving them, including Bing telling a New York Times columnist to abandon his marriage, and the AI demanding an apology from a Reddit user during an argument over whether the year was 2022 or 2023.
Microsoft said that the “long and intricate” chat sessions that prompted many of the unusual responses were “not something we would typically find with internal testing.” But it hopes that improvements to the program, including its potential new choice of tone for responses, will help give people “more control on the type of chat behavior to best meet” their needs.