New York – AI chatbots have been pitched as productivity tools for consumers: They can help you plan a trip, for example, or advise you on how to write a difficult email to your landlord. But they often sound stilted, strangely opinionated, or just plain weird.
And despite the proliferation of chatbots and other artificial intelligence (AI) tools, many people still find it difficult to trust them and do not necessarily want to use them on a daily basis.
Now, Microsoft is trying to fix that, focusing on the “personality” of its chatbot and how it makes users feel, not just what it can do for them.
Microsoft on Tuesday announced a major update to Copilot, its AI system, which it says marks the first step toward creating an “AI companion” for users.
The updated Copilot has new capabilities, such as real-time voice interactions and the ability to interpret images and text on users’ screens. Microsoft also says it is one of the fastest AI models on the market. But the most important innovation, according to the company, is that the chatbot will now interact with users with a “warm tone and distinctive style, providing not only information but also encouragement, feedback and advice as you navigate the everyday challenges of life.”
The changes could help Microsoft’s Copilot stand out in a growing sea of general-purpose AI chatbots. When Microsoft launched Copilot, then called Bing, early last year, it was considered a leader among its Big Tech peers in the AI arms race. But in the 18 months since then, its competitors have pulled ahead with new features, such as bots that can hold voice conversations, and easy-to-access (if imperfect) AI integrations with tools people already use regularly, such as Google Search. With the update, Copilot is catching up on some of those capabilities.
When I tried out the new Copilot Voice feature at Microsoft’s launch event on Tuesday, I asked for advice on how to help a friend who’s about to have her first baby. The bot responded with practical suggestions, such as providing meals and running errands, but also offered more emotionally attuned advice.
“What good news!” said the tool in an upbeat male voice – Copilot is designed to subtly mirror users’ tone – that the company calls Canyon. “Being there for her emotionally is an important thing. Listen to her, reassure her and be her cheerleader… Don’t forget to celebrate this moment with her.”
The Copilot update reflects Microsoft’s vision for how everyday people will use AI as the technology develops. Microsoft AI CEO Mustafa Suleyman argues that people need artificial intelligence to be more than just a productivity tool; they need it to be a kind of digital friend.
“I think in the future, the first thing you’ll think is, ‘Hey, Copilot,’” Suleyman said in an interview ahead of Tuesday’s announcement.
“You’re going to ask your AI companion to remember it, or to buy it, or to book it, or to help you plan it, or to show it to you… It’s going to be an injection of confidence, it’s going to be there to support you, it’s going to be your ‘hype man,’ you know? It will be present on many, many platforms, like all your devices, in your car, in your home, and it will really start to live life by your side.”
The first version of Microsoft’s AI chatbot received some criticism for its unexpected changes in tone and, at times, downright concerning responses. The bot would start an interaction by sounding empathetic, but could become insolent or rude during long exchanges. On one occasion, the bot told a reporter from The New York Times that he should leave his wife because “I just want to love you and for you to love me.” (Microsoft later limited the number of messages that users can exchange with the chatbot in a session, to prevent these types of responses.)
Some experts have also raised broader concerns about the possibility of people forming emotional bonds with bots that seem too human at the expense of their real-world relationships.
To address these concerns while still developing Copilot’s personality, Microsoft has a team of dozens of creative directors, language specialists, psychologists, and other non-technical workers who interact with the model and provide it with feedback on ideal ways to respond.
“We’ve actually created an AI model designed for conversation, so it’s more fluid and friendly,” Suleyman said. “It has, you know, real energy… It has character. It occasionally pushes back, it can be a little funny, and it’s really optimized for this long-term conversational exchange, rather than a question-answer thing.”
Suleyman added that if you tell the new Copilot that you love it and would like to get married, “it’ll know that’s not something it should talk to you about. It will remind you, politely and respectfully, that that is not what it is here for.”
And to avoid the kind of criticism that dogged OpenAI over a chatbot voice that resembled that of actress Scarlett Johansson, Microsoft paid voice actors to provide training data for four voice options intentionally designed not to imitate well-known characters.
“Imitation is confusing. These things are not human and should not try to be,” Suleyman said. “They should give us enough of a sense that it’s comfortable and fun and familiar to talk to, while still being separate and distant… that boundary is how we form trust.”
Building on the voice function, the new Copilot will have a “daily” feature that reads users the weather and a news summary updated each day, thanks to partnerships with news outlets such as Reuters and the Financial Times.
Microsoft also integrated Copilot into its Microsoft Edge browser: when users need a question answered or text translated, they can type @copilot in the address bar to chat with the tool.
Advanced users who want to experiment with features still in development will have access to what Microsoft calls “Copilot Labs.” They’ll be able to try out new features like “Think Deeper,” which the company says can reason through more complex questions, and “Copilot Vision,” which can see what’s on your computer screen and answer questions or suggest next steps.
Following some backlash over the privacy risks of a similar AI tool it launched for Windows earlier this year, called Recall, Microsoft says that Copilot Vision sessions are entirely voluntary and that none of the content seen by the tool is stored or used for training.