Microsoft has outlined its belief that users should actively collaborate with AI, stating that generative AI’s responses can be “usefully wrong”.

In a blog post about its philosophy on AI, the tech giant suggested that generative AI should be considered a creative collaborator rather than a simple answer-providing machine. Microsoft noted that if a user were to ask its AI-powered productivity tool Copilot for 20 ideas to tackle a work challenge, the answers offered would be imperfect but valuable, much like suggestions from a colleague or friend.

“At Microsoft, we like to say that this is when AI is ‘usefully wrong’,” the company wrote.

“It’s a different, more collaborative way of working with technology—one that upends our long-held assumptions about what computers can and can’t do. We’re used to one interaction with a computer: put in a query, get an answer. With AI, the answer isn’t the final word; the magic is in the conversation, the back and forth. And it’s up to people to build on, combine, or transform the content into something original and meaningful.”

Microsoft implied that AI’s “usefully wrong” answers could offer unexpected or unorthodox ideas that challenge traditional thinking. This is perhaps where Microsoft was keen to distinguish between an odd but original suggestion and an AI “hallucination”: fabricated or false information that could undermine users’ ability to act on those suggestions.

“AI can offer up content that is unexpected, surprising, or just plain weird,” the blog said. “Don’t let that discourage you. Instead, embrace the challenge of approaching a problem from a different perspective.”

Another aspect of Microsoft’s philosophy on AI is the assertion that the technology should be used to improve human workflows rather than replace them, with the company stressing vigilance in checking every answer or suggestion it provides: “AI isn’t a replacement for your creativity or judgment; it’s a tool that can enhance and augment those innately human skills.”

Microsoft suggested that users assess AI answers for relevance, appropriateness, and personality, and constantly consider how those answers might be improved, illustrating the company’s concept of AI as a collaborative tool that demands user refinement, however sophisticated its models become.

What’s the Latest on Copilot?

The consumer-focused Windows Copilot is out now and is “seamlessly available across all the apps and experiences you use most”, as Microsoft CEO Satya Nadella described it, including Office 365, Bing and Windows. It can be accessed via the taskbar or the Win+C keyboard shortcut, offering assistance alongside every app. Copilot in Windows features the new Copilot icon, the new Copilot user experience, and Bing Chat, and is available to commercial customers for free.

Business customers will have to wait until November 1 for Microsoft 365 Copilot, when it becomes available on certain business and enterprise plans.

What Microsoft 365 Copilot provides beyond the free Microsoft Copilot is commercial data protection, guaranteed security, privacy and compliance, the AI-powered Microsoft 365 Chat, and integration across the Microsoft 365 apps. It will cost $30 per user per month and will be available to customers on Microsoft 365 E3, E5, Business Standard and Business Premium plans when it becomes generally available.

Earlier this week, Microsoft announced the “next-generation” of OneDrive, which encompasses new file views, governance controls, creation tools, and Copilot. From December, Copilot will be available for Microsoft 365 subscribers to search, organise, and retrieve information from their OneDrive files.

Last month, Microsoft promised Copilot customers legal protections around copyright.

As concerns mount over the legal risks of AI models drawing on copyright-protected IP, the Copilot Copyright Commitment aims to address questions around IP infringement for those planning to sign up for Microsoft’s AI-powered productivity tool.


