Microsoft Copilot for Security Frequently Asked Questions
General information
What is Microsoft Copilot for Security?
Microsoft Copilot for Security is an AI cybersecurity product that enables security professionals to respond to threats quickly, process signals at machine speed, and assess risk exposure in minutes.
How is Microsoft Copilot for Security different from other AI security products?
Copilot for Security combines advanced GPT-4 models from OpenAI with everything that Microsoft brings to the table, including hyperscale infrastructure, cyber-specific orchestration, Microsoft Security's unique expertise, global threat intelligence, and comprehensive security products.
It's currently the only security AI solution built using Microsoft’s unique relationship with OpenAI, giving customers access to the latest and most advanced large language models (LLMs) and Microsoft’s hyperscale AI infrastructure.
Microsoft is in a unique position to transform security for our customers, not only because of our investments in AI, but also because we offer end-to-end security, identity, compliance, and much more across our portfolio. We're optimized to cover more threat vectors and deliver value with a coordinated experience.
What are the use cases and capabilities that Copilot for Security unlocks for customers?
The hero use cases are security operations center (SOC) scenarios, specifically incident summarization, impact analysis, reverse engineering of scripts, and guided response.
Does Microsoft Copilot for Security work with other Microsoft products?
Yes. Copilot for Security works with other Microsoft Security products. These products include, but aren't limited to:
- Microsoft Defender XDR
- Microsoft Sentinel
- Microsoft Intune
- Microsoft Defender Threat Intelligence
- Microsoft Purview
- Microsoft Defender External Attack Surface Management
Copilot for Security can access data from these products and provide an assistive Copilot experience to increase the effectiveness and efficiency of security professionals using those solutions. For example, capabilities such as script analysis enable customers to analyze hundreds of lines of code and interpret them via natural language in minutes. This capability drastically surpasses even advanced analyst skills in terms of both speed and expertise. Copilot for Security helps security professionals discover risks earlier, respond to them with greater guidance, and remain on top of vulnerabilities in the evolving threat landscape.
How does Copilot for Security make Microsoft Defender XDR and Microsoft Sentinel better?
Microsoft Defender XDR and Microsoft Sentinel become even more powerful when security professionals use Copilot for Security. Copilot for Security delivers an experience that enriches and builds on the security data, signals, and existing incidents and insights sourced from Microsoft Defender XDR and Microsoft Sentinel. The new, embedded experience in Microsoft Defender XDR supercharges security teams with generative AI capabilities to take their efficiency to a new level for the following set of powerful use cases:
- Respond to threats at the speed of AI with assisted incident investigation and response. With the embedded experience in Defender, Copilot for Security provides summaries for active incidents, actionable step-by-step guidance for incident response, and complete post-response activity reports, all in seconds and at the click of a button.
- Scale advanced tasks to all skill levels. Copilot for Security enables defenders at all skill levels to discover threats and vulnerabilities across multiple threat vectors with ease. The solution reasons in real time across security data and delivers an accessible way to perform advanced tasks using natural language.
- Perform malicious code analysis in real time. Previously, malware analysis and reverse engineering were limited to advanced responders. With Copilot for Security, customers can analyze complex command-line scripts and translate them into easily comprehensible natural language to help analysts understand the actions and motivations of attackers.
- Apply threat intelligence to your investigation workflows with ease. With Copilot for Security, users can gain structured and contextualized insights into emerging threats, attack techniques, and whether an organization is exposed to a specific threat. Copilot for Security helps prevent exposure to activity group campaigns and respond to incidents with greater guidance.
Does Copilot for Security replace Microsoft Defender XDR and Microsoft Sentinel?
No, Copilot for Security doesn't replace Microsoft Defender XDR or Microsoft Sentinel. Copilot for Security assists security professionals in their day-to-day work, providing an upskilling experience and increased efficiency. It adds value on top of Microsoft Defender XDR and Microsoft Sentinel.
Does Copilot for Security include access to Microsoft Defender Threat Intelligence (Defender TI)?
Yes*. When prompted, Copilot for Security reasons over all content and data in Microsoft Defender Threat Intelligence (Defender TI) to return crucial context around activity groups, tooling, and vulnerabilities. Customers also have tenant-level Defender TI premium workbench access, enabling them to access the full breadth of Defender TI intelligence, including intel profiles, threat analysis, internet data sets, and more, to do a deeper dive into the content surfaced in Copilot for Security.
*This access doesn't include the Defender TI API, which remains separately licensed.
When is Copilot for Security generally available?
Copilot for Security will be generally available for purchase on April 1, 2024.
Who are the intended users of Copilot for Security?
SOC analysts, compliance analysts, and IT admins are the intended users of Copilot for Security.
What languages are supported?
Copilot for Security supports multiple languages. The model is available in eight languages* and the user experience is available in 25 languages.**
*Model: English, German, Japanese, Spanish, French, Italian, Portuguese, and Chinese
**UX: The above languages plus Korean, Dutch, Swedish, Polish, Norwegian, Turkish, Danish, Finnish, and more.
For more information, see Supported languages.
Will Early Access Program (EAP) customers receive GA features on April 1, and any other features that are added before their EAP agreement ends?
Yes, EAP customers will receive the features that are in the GA product and any other feature updates that occur during their EAP agreement time.
What happens to customers that are participating in the Early Access Program at GA?
Customer access under the Early Access Program will end six months after their purchase date. The contractual terms applicable to customers' use and consumption of the product under the Early Access Program apply during this period. After this period and after GA, Copilot for Security will be available for purchase and consumption under Microsoft's standard contracting channels. A migration plan is in place to support customers migrating from EAP to GA and to ensure that all their information is carried over to the GA product.
When an EAP customer decides to purchase the GA product within 90 days of EAP expiry, Customer Data will remain available.
If an EAP customer decides not to purchase the GA product within 90 days of EAP expiry, their EAP Customer Data is deleted in accordance with our data retention policies.
Is Customer Data used to train Azure OpenAI Service foundation models?
No, Customer Data isn't used to train Azure OpenAI Service foundation models and this commitment is documented in our Product Terms. For more information on data sharing in the context of Copilot for Security, see Privacy and data security.
What is the GDPR Guidance for EU Markets?
Microsoft complies with all laws and regulations applicable to its providing the Products and Service including security breach notification law and Data Protection Requirements (as defined in the Microsoft DPA). However, Microsoft isn't responsible for compliance with any laws or regulations applicable to Customer or Customer's industry that aren't generally applicable to information technology service providers. Microsoft doesn't determine whether Customer's data includes information subject to any specific law or regulation. For more information, see Microsoft Products and Services Data Protection Addendum (DPA).
Are US Government Cloud (GCC) customers eligible?
GCC isn't available at GA. At this time, Copilot for Security isn't designed for customer usage with US Government clouds, including, but not limited to, GCC, GCC High, DoD, and Microsoft Azure Government. While the technical path for the Microsoft Sentinel connector works, the tenant is unable to access 75% of the product features because the Defender interfaces and data live within Microsoft Azure Government, which Copilot isn't integrated with.
Are US and Canada health care customers eligible?
US and Canada health and life sciences (HLS) customers are eligible to purchase Copilot for Security. Microsoft Copilot for Security is now listed and covered by a Business Associate Agreement ("BAA"), which is important to healthcare providers who are subject to regulations under HIPAA. Additional information on compliance offerings currently covered for Microsoft Copilot for Security can be found in the Service Trust Portal.
How do I export or delete data from Copilot for Security?
You will need to contact support. For more information, see Contact support.
What is the difference between ChatGPT and Copilot for Security?
ChatGPT and Copilot for Security are both artificial intelligence (AI) technologies that were developed with the intent of helping users accomplish tasks and activities faster and more efficiently. While they might seem similar, there are significant differences between the two.
ChatGPT is a natural language processing technology. ChatGPT uses machine learning, deep learning, natural language understanding, and natural language generation to answer questions or respond to conversations. ChatGPT is trained on data from the internet, uses prompts from users to aid in prompt engineering and model adjustments, and is limited to three concurrent plugins.
Copilot for Security is a natural language, AI-powered security analysis tool designed to help organizations defend against threats at machine speed and scale. Copilot for Security is built on OpenAI technology and is designed and engineered as an enterprise cyber AI from the ground up. The platform works off customer-connected plugins and Microsoft's global threat intelligence as grounding data. Entered prompts don't inform the model or prompt engineering unless submitted by the customer for review.
A key difference between ChatGPT and Copilot for Security is what the systems are designed to accomplish. Microsoft Copilot for Security is designed for posture management, incident response, and reporting. The solution draws insights from security signals aggregated from plugins, while ChatGPT works like a chatbot designed to hold a conversation with a user.
Copilot for Security has access to up-to-date information from threat intelligence and draws insights from plugins so that security professionals are better equipped at defending against threats. Microsoft Copilot for Security doesn't always get everything right and as with all AI tools, responses can contain mistakes. The built-in feedback mechanism provides users with control in helping improve the system.
Purchase information
How can customers purchase at GA?
Copilot for Security is available across all channels: EA, MCA-E, CSP, Buy Online, and legacy Web Direct.
Are there any prerequisites to purchase?
An Azure subscription and Microsoft Entra ID (formerly known as Azure Active Directory) are prerequisites for using Copilot for Security; there are no other product prerequisites. For more information, see Get started with Copilot for Security.
How are provisioned SCUs billed?
Copilot for Security is sold in a provisioned capacity model and is billed by the hour. You can provision Security Compute Units (SCUs) and increase or decrease them at any time. Billing is calculated in fixed clock-hour blocks rather than in rolling 60-minute increments, with a minimum of one hour. Any usage within the same clock hour is billed as a full SCU, regardless of start or end times within that hour. For instance, if you provision an SCU at 9:05 a.m., deprovision it at 9:35 a.m., and then provision another SCU at 9:45 a.m., you're charged for two units within the 9:00 a.m. to 10:00 a.m. hour. Similarly, if you provision an SCU at 9:45 a.m., you only have 15 minutes to use it before it's no longer available, because SCUs are provided in hourly blocks, in this case from 9:00 a.m. to 10:00 a.m. To maximize usage, make SCU provisioning changes at the beginning of the hour. For more information, see Manage usage.
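To make the clock-hour arithmetic concrete, here's a minimal Python sketch of the billing model described above (illustrative only; the function and event format are invented for this example, and actual charges are determined by Microsoft's billing system):

```python
from datetime import datetime

def billed_scu_hours(events):
    """Estimate billed SCU-hours under the clock-hour block model.

    Each (provision_time, deprovision_time, scu_count) interval is billed
    for every clock-hour block it touches, because any usage within an
    hour is charged as a full hour.
    """
    total = 0
    for start, end, scus in events:
        first_block = start.replace(minute=0, second=0, microsecond=0)
        last_block = end.replace(minute=0, second=0, microsecond=0)
        blocks = int((last_block - first_block).total_seconds() // 3600) + 1
        total += blocks * scus
    return total

# The example from the text: one SCU from 9:05 to 9:35 a.m., then another
# provisioned at 9:45 a.m. Both fall in the 9:00-10:00 block, so two
# SCU-hours are billed.
events = [
    (datetime(2024, 4, 1, 9, 5), datetime(2024, 4, 1, 9, 35), 1),
    (datetime(2024, 4, 1, 9, 45), datetime(2024, 4, 1, 9, 59), 1),
]
print(billed_scu_hours(events))  # 2
```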
Technical and product questions
Do customers receive onboarding support?
Upon purchase, customers receive easy access to documentation, videos, and blogs.
Where can I find more information on Data Protection and Privacy?
You can learn more at the Microsoft Trust Center.
What are the Compliance Offerings for Microsoft Copilot for Security?
Microsoft Copilot for Security has achieved compliance certifications including ISO 27001, 27018, 27017, 27701, 20000-1, 9001, and 22301. In addition, Microsoft Copilot for Security is now listed and covered by a Business Associate Agreement ("BAA"), which is important to healthcare providers who are subject to regulations under HIPAA.
For more information, see Service Trust Portal.
Is deployed Microsoft Entra ID (formerly known as Azure Active Directory) a requirement for Copilot for Security?
Yes. Copilot for Security is a SaaS application and requires Microsoft Entra ID to authenticate the users who have access.
Does the Microsoft Defender XDR and Microsoft Sentinel integration cover stored data or only alerts and incidents (notable events)?
Copilot for Security covers both alerts/incidents and stored data. The product retrieves data from advanced hunting tables in Microsoft Defender XDR and top data tables in Microsoft Sentinel.
Does Copilot for Security support tenant or subscription transfers?
No, at this time Copilot for Security doesn't support moving Copilot for Security resources across Microsoft Entra tenants or subscription transfers.
What partner tools are integrated with Copilot for Security?
Customers can use ISV-developed third-party plugins such as Cyware, Netskope, SGNL, Tanium, and Valence Security in a Public Preview capacity. Microsoft-developed third-party plugins such as CIRCL.lu, CrowdSec, GreyNoise, and URLScan are also available. More plugins will be added in the future.
Note
Products that integrate with Copilot for Security need to be purchased separately.
Can Copilot for Security isolate machines using Microsoft Defender for Endpoint and Microsoft Intune? Can you customize and/or block individual IOCs?
Copilot for Security can't isolate machines. It can recommend to security admins that certain machines be isolated, but Copilot for Security doesn't take that action itself. You can use Copilot for Security to evaluate whether individual IOCs are present in the environment; however, it doesn't automatically block them. Automation may come at a later date, but for now Copilot for Security doesn't take remediation action on its own.
Is Copilot for Security IPv6 aware?
There's a capability called Get Web Components by IP Address, which currently supports only IPv4.
Does Copilot for Security make recommendations for IoT/OT scenarios?
No, Copilot for Security doesn't currently support IoT/OT.
Does Copilot for Security offer dashboarding, or can you only investigate single events?
Copilot for Security doesn't provide dashboarding; however, you can query multiple incidents across Microsoft Sentinel. As a baseline, it can provide a visualization of an attack path.
Can Copilot for Security execute workflows, from triaging to using pinned messages to governing how the customer should label an incident and whether it should be closed?
No, workflows are currently not supported in Copilot for Security.
What role-based access control or delegation features does Copilot for Security have? How are user permissions kept in Copilot for Security aligned to user permission configurations in other solutions?
Copilot for Security uses "admin on behalf of" (AOBO) rights for the user that is logged in. For more information, see Understand authentication.
Why does Copilot for Security transfer data to a Microsoft tenant?
Copilot for Security is a SaaS (software as a service) offering that runs in the Azure production tenant. Users enter prompts, and Copilot for Security provides responses based on the insights sourced from other products such as Microsoft Defender XDR, Microsoft Sentinel, and Microsoft Intune. Copilot for Security stores past prompts and responses for a user, and the user can access them through the in-product experience. Data from a customer is logically isolated from the data of other customers. This data doesn't leave the Azure production tenant and is stored until customers ask to delete it or offboard from the product.
How is the transferred data secured in transit and at rest?
The data is encrypted both in transit and at rest as described in the Microsoft Products and Services Data Protection Addendum.
How is the transferred data protected from unauthorized access and what testing was done for this scenario?
By default, no human users have access to the database, and network access is restricted to the private network where the Copilot for Security application is deployed. If a human needs access to respond to an incident, an on-call engineer must request elevated access and network access, which must be approved by authorized Microsoft employees.
Apart from regular feature testing, Microsoft also completed penetration testing. Microsoft Copilot for Security complies with all Microsoft privacy, security, and compliance requirements.
In "My Sessions" when an individual session is deleted, what happens to the session data?
Session data is stored for runtime purposes (to operate the service) and also in logs. In the runtime database, when a session is deleted via the in-product UX, all data associated with that session is marked as deleted and the time to live (TTL) is set to 30 days. After that TTL expires, queries can't access that data, and a background process then physically deletes it. In addition to the 'live' runtime database, there are periodic database backups. Backups age out on a short retention period (currently four days).
Logs that contain session data aren't affected when a session is deleted via the in-product UX. These logs have a retention period of up to 90 days.
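As a rough illustration of the soft-delete-with-TTL pattern described above (a generic sketch, not Microsoft's actual implementation), a runtime store might behave like this:

```python
from datetime import datetime, timedelta

TTL = timedelta(days=30)  # matches the 30-day TTL described above

class SessionStore:
    """Generic sketch of soft deletion with a time to live (TTL)."""

    def __init__(self):
        self._records = {}  # session_id -> (data, deleted_at or None)

    def save(self, session_id, data):
        self._records[session_id] = (data, None)

    def delete(self, session_id):
        # Soft delete: mark the record and start the TTL clock.
        data, _ = self._records[session_id]
        self._records[session_id] = (data, datetime.utcnow())

    def get(self, session_id):
        record = self._records.get(session_id)
        if record is None:
            return None
        data, deleted_at = record
        # Once the TTL has expired, queries can no longer see the data,
        # even before the background purge physically removes it.
        if deleted_at is not None and datetime.utcnow() - deleted_at >= TTL:
            return None
        return data

    def purge(self):
        # Background process: physically delete records past their TTL.
        now = datetime.utcnow()
        for sid, (_, deleted_at) in list(self._records.items()):
            if deleted_at is not None and now - deleted_at >= TTL:
                del self._records[sid]
```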
What Product Terms apply to Copilot for Security? Is Copilot for Security a "Microsoft Generative AI Service" within the meaning of Microsoft's Product Terms?
The following Product Terms govern Copilot for Security customers:
- The Universal License Terms for Online Services in the Product Terms, which include the Microsoft Generative AI Services terms and the Customer Copyright Commitment.
- The Privacy & Security Terms in the Microsoft Product Terms, which include the Data Protection Addendum.
Copilot for Security is a Generative AI Service within the definition of the Product Terms. Additionally, Copilot for Security is a "Covered Product" for purposes of the Customer Copyright Commitment. At this time, in the Product Terms there are no product-specific terms unique to Copilot for Security.
In addition to the Product Terms, customers' MBSA/EA and MCA agreements, for example, govern the parties' relationship. If a customer has specific questions about its agreements with Microsoft, engage the CE, the deal manager, or the local CELA supporting the deal.
What is the Microsoft Customer Copyright Commitment?
The Microsoft Customer Copyright Commitment is a new commitment that extends Microsoft's existing intellectual property indemnity support to certain commercial Copilot services. The Customer Copyright Commitment applies to Copilot for Security. If a third party sues a commercial customer for copyright infringement for using Microsoft's Copilots or the output they generate, Microsoft will defend the customer and pay the amount of any adverse judgments or settlements that result from the lawsuit, provided that the customer used the guardrails and content filters built into our products.
Can Copilot for Security customers opt out of Azure OpenAI Service abuse monitoring? Does Copilot for Security engage in any content filtering or abuse monitoring?
Azure OpenAI abuse monitoring is currently disabled service-wide for all customers.
Does Copilot for Security make any location of data processing or data residency commitments?
Location of data processing: At this time, Copilot for Security doesn't make any contractual location of data processing commitments. Under the Product Terms and by using a Microsoft Generative AI Service, Copilot for Security customers agree that their data may be processed outside of their tenant's geographic region. However, customer administrators can select the location for prompt evaluation. While Microsoft recommends allowing prompt evaluation anywhere with available GPU capacity for optimal results, customers may select from four regions to have their prompts evaluated solely in those regions.
Currently Available Regions for prompt evaluation:
- Australia (ANZ)
- Europe (EU)
- United Kingdom (UK)
- United States (US)
Location of data storage (data residency): At this time, Copilot for Security doesn't make contractual data storage/residency commitments. If a customer isn't opted into data sharing*, Copilot for Security stores data at rest in the home Geo of the tenant.
For example, for a customer tenant whose home is in Germany, Copilot for Security stores Customer Data in “Europe” as the designated Geo for Germany.
*If a customer opts into data sharing, Customer Data such as prompts and responses are shared with Microsoft to enhance product performance, improve accuracy, and address response latency. When this event occurs, Customer Data such as prompts can be stored outside of the tenant Geo.
Is Copilot for Security a Microsoft EU Data Boundary service?
At the time of GA, all Microsoft Security Services are out of scope for EU data residency requirements and Copilot for Security won't be listed as an EUDB service.
Where is EU customer data stored?
Copilot for Security stores Customer Data and Personal Data such as user prompts and Microsoft Entra Object IDs in the tenant Geo. If a customer provisions their tenant in the EU and isn't opted in to data sharing, all Customer Data and pseudonymized personal data are stored at rest within the EU. Processing of Customer Data and Personal Data prompts can occur in the designated Security GPU Geo. For more information on Security GPU geography selection, see Get Started with Copilot for Security. If a customer is opted in to data sharing, prompts can be stored outside of the EU Data Boundary. For more information on data sharing, see Privacy and data security in Microsoft Copilot for Security.
Are customer prompts (such as input content from the customer) considered Customer Data within the terms of the DPA and the Product Terms?
Yes, customer prompts are considered Customer Data. Under the Product Terms, customer prompts are considered Inputs. Inputs are defined as "all Customer Data that Customer provides, designates, selects, or inputs for use by a generative artificial intelligence technology to generate or customize an output".
Is "Output Content" considered Customer Data within the terms of the DPA and the Product Terms?
Yes, Output Content is Customer Data under the Product Terms.
Is there a transparency note or transparency documentation for Copilot for Security?
Yes, the Responsible AI transparency document can be found here: Responsible AI FAQ.
How is Copilot for Security dealing with a "token limit"?
Large language models (LLMs), including GPT, have limits on how much information they can process at once. This limit is known as a "token limit"; a token roughly corresponds to three-quarters of an English word. Copilot for Security uses the latest GPT models from Azure OpenAI to process as much information as possible in a single session. In some cases, large prompts, long sessions, or verbose plugin output may overflow the token space. When this happens, Copilot for Security attempts to apply mitigations to ensure an output is always available, even if the content in that output isn't optimal. Those mitigations aren't always effective, and it might be necessary to stop processing the request and direct the user to try a different prompt or plugin.
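For a rough sense of how token counting works, here's a sketch using the open-source tiktoken tokenizer (an assumption for illustration; the exact tokenizer and token limits Copilot for Security uses internally aren't documented):

```python
import tiktoken  # pip install tiktoken

def estimate_tokens(text: str, encoding_name: str = "cl100k_base") -> int:
    """Estimate a prompt's token count with an OpenAI tokenizer.

    cl100k_base is the encoding used by GPT-4-family models; it serves
    here only as a rough proxy for whatever Copilot for Security uses.
    """
    encoding = tiktoken.get_encoding(encoding_name)
    return len(encoding.encode(text))

prompt = "Summarize the latest high-severity incidents in my environment."
print(estimate_tokens(prompt))
# Long sessions and verbose plugin output consume the available context
# window quickly, which is what triggers the mitigations described above.
```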
The Azure OpenAI Service code of conduct includes "Responsible AI Mitigation Requirements". How do those requirements apply to Copilot for Security customers?
These requirements don't apply to Copilot for Security customers because Copilot for Security implements these mitigations.
Partner information
What are the use cases for Partners?
Partners can provide signals or build complementary solutions around Copilot for Security scenarios.
I work with a managed security service provider (MSSP). Can they use and manage Copilot for Security on my behalf?
Yes, MSSPs that provide SOC services for customers can access the customer's Copilot for Security environment if the customer elects to provide access (bring your own MSSP).
Note
Available in the standalone portal only with limited capability.
Onboarding and managing MSSP access to your (customer) tenant is offered via Guest Access (B2B) and granular delegated admin privileges (GDAP). There currently isn't a CSP or reseller multitenant model for MSSPs. Each customer is responsible for purchasing their own SCUs and setting up their MSSPs with the necessary access.
Can MSSPs use a single instance of Copilot for Security to manage multiple tenants?
Copilot for Security doesn't support prompting across multiple tenants. Instead, MSSPs can use Tenant Switching and target one customer tenant, using supported delegated access options.
MSSPs can manage multiple customer tenants within Copilot for Security by using Tenant Switching from their MSSP tenant. Tenant Switching allows partners to switch between tenants and secure one tenant at a time with prompts/queries by selecting the customer tenant in a dropdown. Customers can also add the tenant ID (GUID) to the Copilot for Security session URL as a query string parameter, as sketched below. Customer tenants can be selected based on delegated access to the customer tenant. Copilot for Security is used under the user context, so the partner can only access what the delegated account has been given access to in the customer tenant.
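For example, here's a minimal sketch of building such a session URL in Python (the portal host and the tenantId parameter name are assumptions for illustration; confirm both against current documentation):

```python
from urllib.parse import urlencode

# Assumed portal host and query parameter name; verify in current docs.
PORTAL = "https://securitycopilot.microsoft.com/"
customer_tenant_id = "00000000-0000-0000-0000-000000000000"  # customer GUID

session_url = f"{PORTAL}?{urlencode({'tenantId': customer_tenant_id})}"
print(session_url)
# https://securitycopilot.microsoft.com/?tenantId=00000000-0000-0000-0000-000000000000
```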
Are there third-party integrations available today?
We're working with ISVs like Cyware, Netskope, SGNL, Tanium, and Valence Security to release their plugins in Public Preview and continue to build more integrations with the rest of the Partner ecosystem.
Is there a marketplace for the plugins or services?
There isn't a plugin marketplace at GA. ISVs can publish their solutions to GitHub. At GA, all partners are required to publish their solutions or managed services to the Microsoft commercial marketplace. More information on publishing to the marketplace can be found here:
Publish solution to the Microsoft Commercial Marketplace:
- Create a commercial marketplace account in Partner Center - Partner Center
- How to manage a listing in Partner Center
MSSP Specific: Must have a security designation in Microsoft AI Cloud Partner Program.
SaaS Specific:
- Plan a SaaS offer for the Microsoft commercial marketplace - Marketplace publisher
- Create a SaaS offer in the commercial marketplace - Marketplace publisher
- ISV Success Program Overview
- Build and publish with ISV Success - Partner Center
What about MSSPs who are also participating in Copilot for Security EAP as a customer?
MSSPs who are part of EAP will continue to have access to Copilot for Security until the EAP six-month agreement ends. If an MSSP is using the same tenant for internal and customer-managed SOC services, they need to purchase a capacity plan and ensure that supported delegated access models are enabled through the tenant used for managed SOC services. If an MSSP is using a tenant only for customer-managed SOC services, and that tenant isn't used for internal SOC services, the MSSP needs to have the tenant manually provisioned for Copilot for Security by Microsoft. In that case, no capacity plan purchase is needed to provision Copilot for Security.
What if MSSPs aren't using Microsoft Defender XDR or Microsoft Sentinel?
Microsoft Copilot for Security doesn't have any specific Microsoft security product requirement for provisioning or use, since the solution is built on aggregating data sources from both Microsoft and third-party services. That said, there's significant value in having Microsoft Defender XDR and Microsoft Sentinel enabled as supported plugins for enriching investigations. Copilot for Security only uses skills and accesses data from enabled plugins.
Does an MSSPs SOC Solution need to be hosted on Azure?
Hosting the solution on Azure is recommended but not required.
Is there a product roadmap that can be shared with Partners?
Not at this time.