Microsoft Security Copilot is now generally available for purchase (as of April 1, 2024). Originally introduced alongside a range of updates and new features for Microsoft’s Copilot portfolio in March 2023, this new resource promises to enhance business security with next-level AI.

Microsoft’s latest Copilot is the industry’s first generative AI tool to support security and IT teams. The tech giant says the solution will help professionals move faster, catch threats others miss, and enhance their expertise.

Informed by large-scale data, threat intelligence, and more than 78 trillion daily security signals, Security Copilot could change how we manage risk in the modern world. In particular, it could help address the security skills gap at a time when more than 3.4 million security roles remain unfilled.

I examined this potentially revolutionary tool more closely to determine precisely what it can accomplish for today’s security professionals.

What Is Microsoft Security Copilot?

Microsoft Security Copilot, or “Copilot for Security,” is the latest generative AI solution created by Microsoft as part of its broader Copilot ecosystem. It was introduced at the inaugural Microsoft Secure event in 2023 and became generally available to purchase in April 2024.

According to the tech giant, the solution empowers security defenders to improve outcomes at machine-level speed and scale without compromising on compliance. It provides a natural language experience, similar to Microsoft’s other Copilot tools.

Designed with a focus on flexibility and extensibility, Microsoft Security Copilot offers a standalone experience and integrates seamlessly with products in Microsoft’s broader security portfolio. For instance, it can connect with:

  • Microsoft’s Unified Security Operations Platform
  • Microsoft Defender Threat Intelligence
  • Microsoft Sentinel
  • Microsoft Defender XDR
  • Microsoft Intune
  • Microsoft Entra
  • Microsoft Purview
  • Microsoft Defender External Attack Surface Management

Plus, the solution benefits from access to Copilot’s broad ecosystem of more than 100 software partners and managed security service providers. The system also offers a multilingual interface for up to 25 languages and can respond to prompts in 8 languages.

How Does Microsoft Copilot for Security Work?

Like many of the top LLM-powered generative AI models, Microsoft Security Copilot is simple to use. You can access capabilities through the standalone experience or other Microsoft security products.

The language model works together with Microsoft’s proprietary security technologies in an integrated system. Users can access Security Copilot through Microsoft’s security solutions (listed above) and extend it with plugins from Microsoft and third parties. Plugins bring additional context into Security Copilot from event logs, incidents, alerts, and policies.

Plus, Copilot can access authoritative content and threat intelligence through plugins that can search through Microsoft Defender articles and reports. Interacting with the solution is similar to interacting with Microsoft’s Bing or Teams Copilot tools.

Users submit prompts to a conversational interface, which Security Copilot pre-processes through a step called “grounding.” Grounding adds context and specificity to the prompt so the answers you receive are relevant to your needs. Copilot then takes the response from the language model and post-processes it before returning it to the user.

How the AI System Works

Here’s a quick rundown of what working with Microsoft Security Copilot looks like (a simplified sketch of this flow follows the list):

  • Step 1: A user sends a prompt to Copilot for Security from the standalone portal or one of Microsoft’s security products.
  • Step 2: Copilot then pre-processes the prompt through a “grounding” process, improving the specificity of the input to ensure a relevant and actionable response.
  • Step 3: Copilot accesses your plugins for pre-processing and then sends the modified prompt to Microsoft’s language model.
  • Step 4: Copilot for Security takes the response from the language model and post-processes it, gathering contextual information from plugins.
  • Step 5: The AI app returns the response to the user.
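To make the steps above more concrete, here’s a minimal, purely illustrative sketch of a grounding-style pipeline in Python. None of the class or function names below come from Microsoft’s actual product or APIs (Security Copilot’s internals aren’t public); they’re hypothetical stand-ins that mirror the pre-process, language model, and post-process sequence described in the list.

```python
# Hypothetical sketch of a grounding-style pipeline (not Microsoft's API).
# Every name here is illustrative; the real Security Copilot internals are not public.
from dataclasses import dataclass, field


@dataclass
class Plugin:
    """Stand-in for a Security Copilot plugin that contributes context."""
    name: str

    def fetch_context(self, prompt: str) -> str:
        # A real plugin might query event logs, incidents, alerts, or policies.
        return f"[{self.name}] context related to: {prompt}"


@dataclass
class SecurityCopilotPipeline:
    plugins: list = field(default_factory=list)

    def ground(self, prompt: str) -> str:
        """Steps 2-3: pre-process the prompt by adding plugin context (grounding)."""
        context = "\n".join(p.fetch_context(prompt) for p in self.plugins)
        return f"{prompt}\n\nContext:\n{context}"

    def call_language_model(self, grounded_prompt: str) -> str:
        """Step 3: send the modified prompt to a language model (mocked here)."""
        return f"Draft answer based on: {grounded_prompt[:60]}..."

    def post_process(self, raw_response: str) -> str:
        """Step 4: post-process the model response before returning it."""
        return raw_response.strip()

    def handle(self, prompt: str) -> str:
        """Steps 1-5: full round trip from user prompt to final response."""
        grounded = self.ground(prompt)
        raw = self.call_language_model(grounded)
        return self.post_process(raw)


if __name__ == "__main__":
    pipeline = SecurityCopilotPipeline(plugins=[Plugin("Defender XDR"), Plugin("Sentinel")])
    print(pipeline.handle("Summarize incident 1234"))
```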

What Can Security Copilot Do?

After an initial beta testing and early access phase, Microsoft has already proven how valuable its new AI solution can be. In a “Copilot for Security” economic study, the tech giant discovered that experienced security professionals were 22% faster at analyzing threats using Copilot.

They were also 7% more accurate when completing tasks, and 97% said they wanted to continue using the solution going forward. Even novice analysts were 44% more accurate in their work.

Primarily, Microsoft Security Copilot empowers IT professionals and security teams to catch threats faster, access critical guidance in seconds to mitigate risks, and even strengthen team expertise.

Some of the primary use cases for the AI app include:

  • Incident summarization: Users can rapidly leverage generative AI to distill comprehensive security alerts into concise summaries for quicker response times.
  • Impact analysis: Microsoft’s toolkit leverages AI to assess the potential threat level of different security incidents, highlighting affected data and systems.
  • Reverse engineer scripts: With Copilot, users don’t have to reverse engineer malware manually. They can quickly analyze complex command line scripts and translate them into easy-to-understand language, explaining each action clearly.
  • Guided response: Both expert and novice security professionals can receive step-by-step guidance from Copilot on how to respond to a threat. The bot can offer directions for triage processes, investigation, remediation, containment, and more.

The Latest Updates to the AI

Alongside announcing general availability in April 2024, Microsoft also announced a handful of new product capabilities, such as:

  • Custom prompt books: Allowing users to create, save, and share prompts for security processes with their entire team.
  • Knowledge base integrations (In preview): This allows users to integrate Copilot with business information to search for risks in proprietary content.
  • Multi-language support: Copilot can now respond to prompts in 8 languages, and the interface supports 25 languages.
  • Upgraded third-party integrations: The number of third-party plugins and integrations is growing quickly. New additions include solutions from Netskope, Tanium, Valence Security, Cyware, and SGNL.
  • Usage reporting: Admins can now access dashboards for insights into how their teams use Copilot so that they can find opportunities for resource optimization.
  • Microsoft Entra audit logs and diagnostic logs: Users can access these logs to gain additional insight into IT issues and security investigations.
  • Defender External Attack Surface Management: This new integration allows users to identify and analyze up-to-date information on external attack surface risks.

How to Access Microsoft Security Copilot

Microsoft Copilot for Security is now available worldwide, although the company is still working on adding support for new languages.

Like most of Microsoft’s Copilot tools, the security app isn’t free to access. The Security Copilot offering is pay-as-you-go: you use your existing Azure subscription to access it, and pricing is based on provisioned Security Compute Units (SCUs), which Microsoft estimates at around $4 per SCU per hour.

You can find the full pricing information for Security Copilot here.
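To give a rough sense of how the pay-as-you-go model adds up, the back-of-the-envelope sketch below estimates a monthly bill from the number of provisioned SCUs. It assumes the roughly $4 per SCU per hour figure mentioned above and a 730-hour month with capacity provisioned around the clock; always check Microsoft’s pricing page for current rates.

```python
# Back-of-the-envelope Security Copilot cost estimate.
# Assumes ~$4 per SCU per hour (see pricing above) and a 730-hour month;
# verify current rates on Microsoft's pricing page before budgeting.

HOURLY_RATE_PER_SCU = 4.00   # USD, approximate
HOURS_PER_MONTH = 730        # average hours in a month


def estimated_monthly_cost(provisioned_scus: int) -> float:
    """Estimate the monthly cost of keeping `provisioned_scus` running 24/7."""
    return provisioned_scus * HOURLY_RATE_PER_SCU * HOURS_PER_MONTH


for scus in (1, 3, 10):
    print(f"{scus} SCU(s): ~${estimated_monthly_cost(scus):,.0f} per month")
# 1 SCU(s): ~$2,920 per month
# 3 SCU(s): ~$8,760 per month
# 10 SCU(s): ~$29,200 per month
```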

Once you have a subscription, you’ll be prompted to take a few additional steps to enable Security Copilot. First, you need an Azure subscription (which you can create for free if you don’t already have one). Next, you’ll need to provision the right amount of “capacity” for your needs.

You can manage capacity by increasing or decreasing the number of provisioned SCUs for Copilot, either in the Azure portal or in the Copilot for Security portal.

Provisioning Capacity within Copilot for Security

Microsoft recommends provisioning capacity within the Copilot for Security portal. You’ll need at least one SCU (the maximum is 100). If you choose this method, you’ll need to visit the Copilot for Security page here and follow the step-by-step instructions.

Notably, you will need to be an Azure owner or a resource group-level contributor. Once you’ve signed in to your account, click “Get Started,” select your Azure subscription, associate capacity with your resource group, and give the capacity a name. You’ll also need to specify the required number of security compute units.

Microsoft will show you the estimated monthly cost for the SCUs you’ve requested. If you’re happy, select “Continue,” and Azure will deploy your resource.

Provisioning Capacity in Azure

The second option is to provision capacity in the Azure portal. To do this, you’ll need to be an Azure owner or a resource group-level contributor. Sign in to the Azure portal and search for “Copilot for Security” in your service list.

Select Copilot, then click “Resource Groups” and “Plan.” Select “Microsoft Copilot for Security” followed by “Create,” then choose your subscription and resource group, capacity name, prompt evaluation location, and required number of SCUs.

Once again, you’ll see an estimated monthly cost, and you can then click “Review and Create.” Check that all the information is correct, then select “Create” to finish your setup.
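If you prefer scripting over clicking through the portal, something along the lines of the sketch below could create the same capacity resource programmatically. This is an assumption-heavy illustration, not an official procedure: the “Microsoft.SecurityCopilot/capacities” resource type, the API version, and the “numberOfUnits” property are my guesses at the underlying ARM resource and should be verified against current Azure documentation. The Azure SDK calls themselves (DefaultAzureCredential, ResourceManagementClient) are standard.

```python
# Hypothetical sketch: provisioning Security Copilot capacity via the ARM API
# instead of the portal. The resource type "Microsoft.SecurityCopilot/capacities",
# the API version, and the "numberOfUnits" property are assumptions -- confirm
# them against current Azure documentation before using anything like this.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient
from azure.mgmt.resource.resources.models import GenericResource

SUBSCRIPTION_ID = "<your-subscription-id>"
RESOURCE_GROUP = "security-copilot-rg"          # hypothetical resource group
CAPACITY_NAME = "example-copilot-capacity"      # hypothetical capacity name

client = ResourceManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

resource_id = (
    f"/subscriptions/{SUBSCRIPTION_ID}"
    f"/resourceGroups/{RESOURCE_GROUP}"
    f"/providers/Microsoft.SecurityCopilot/capacities/{CAPACITY_NAME}"
)

# Create (or update) the capacity with the desired number of SCUs.
poller = client.resources.begin_create_or_update_by_id(
    resource_id=resource_id,
    api_version="2023-12-01-preview",   # assumed API version -- verify before use
    parameters=GenericResource(
        location="eastus",              # prompt evaluation location for your compliance needs
        properties={"numberOfUnits": 1},  # assumed property name for the SCU count
    ),
)
print(poller.result())
```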

Getting Started with Microsoft Security Copilot

Once you’ve provisioned your resources for Security Copilot, you’ll need to set up a default environment. Microsoft will walk you through this process, informing you where customer data will be stored (based on location).

You’ll also need to choose the data-sharing options you want to access and review the “default roles” of people who can access the app.

From there, you can start assigning roles to your users. By default, all of the users in your tenant should have basic access to the platform. However, only people with specific permissions will be able to access security data with prompts. You can follow Microsoft’s guide on assigning roles for tips on giving each user the right permission.

One helpful thing I noticed is that Copilot for Security comes with a “start-up” tour to guide you through the application. It launches automatically when you log in to the platform and shows you how different features (like prompting) work.

Once fully set up, you can start experimenting with Microsoft Security Copilot either as a standalone solution (through the portal) or through embedded experiences within Microsoft’s security apps. You can also start adding integrations and plugins at this stage.

The Value of Microsoft Security Copilot

Microsoft’s trials with its new security AI demonstrate that this app is a powerful solution for boosting security and governance in the workplace, and elevating employee experience. Notably, the solution ensures people at all experience levels can turbocharge their productivity by using the intuitive interface to bring AI into their workflows.

What’s more, unlike some other AI apps, Copilot for Security automatically integrates with Microsoft’s existing security products, making it easy to take your current workflows to the next level. For instance, within the Unified Security Operations Platform, you’ll get an embedded copilot experience in the Microsoft Defender portal for SIEM (Security Information and Event Management) and XDR (Extended Detection and Response).

Copilot will automatically surface relevant details for security summaries, guide analysts through risk mitigation, and even draw insights from Microsoft threat investigation tools. You can also use:

  • Microsoft Copilot in Entra: To evaluate user risks, discover ways to automate threat prevention, and streamline resolution strategies for future identity attacks.
  • Copilot in Purview: To access concise alert summaries, natural language support, and integrated insights within investigation workflows.
  • Microsoft Copilot in Intune: To make informed decisions for endpoint management, with root cause analysis, complete device context, device configuration, and error code analysis.

Mastering AI Security with Microsoft

Microsoft is also well aware of the risks associated with generative AI adoption. Alongside introducing a security-focused Copilot, the company recently rolled out various features across its security portfolio to help teams manage AI risk. For instance, security teams can now:

  • Discover AI risks: By monitoring AI usage, tracking data leaks, and identifying which users access high-risk applications.
  • Defend AI apps: By adapting the prompt libraries teams use and augmenting their responses, with a focus on protecting sensitive data.
  • Govern usage: By retaining and logging all interactions with AI apps across the organization and investigating new incidents.

Microsoft also announced the arrival of new out-of-the-box threat detection options for Copilot in Microsoft 365 via the Defender for Cloud Apps solution. On top of all that, the company continues expanding its security portfolio with new features and solutions.

Companies can now access Microsoft Security Exposure Management to unlock the benefits of a unified posture and attack surface management solution with automatic threat discovery and visualization. Adaptive Protection has now been introduced to Microsoft Purview, with Microsoft Entra Conditional Access, to help protect businesses against insider risks.

Microsoft Communication Compliance now offers companies access to sentiment indicators and insights so businesses can rapidly pinpoint communication risks across Microsoft apps. Plus, Microsoft Intune launched three new solutions this year: Enterprise Application Management, Microsoft Cloud PKI, and Advanced Analytics.

Security Enhancement in the Age of AI

Microsoft Security Copilot is one of the first generative AI solutions to focus specifically on the needs of the security landscape. Combining security-specific models with large language model technology could empower businesses to stay safe in an evolving threat landscape.

The Copilot solution represents a significant boost to Microsoft’s entire security portfolio. It empowers companies to leverage deeper protections with the help of innovative AI.

Microsoft says it will continue to build on this innovation. Plus, it will introduce new features to its security product collection in the months and years ahead. If you want to experiment with Microsoft Copilot for Security, you can find Microsoft’s complete guide to the software here.

 


