Privacy and data security in Microsoft Copilot for Security

When you use Microsoft Copilot for Security, Customer Data and system-generated logs are stored and processed as part of the service.

Data sharing is turned on by default. Global Administrators and Security Administrators are assigned a Copilot owner role in Copilot for Security. Copilot owners can change data sharing settings for Customer Data during the first run experience and at any time thereafter. For more information on roles, see Copilot for Security roles.

Important

Microsoft recommends that you use roles with the fewest permissions. Using lower permissioned accounts helps improve security for your organization. Global Administrator is a highly privileged role that should be limited to emergency scenarios when you can't use an existing role.

This article compares Copilot for Security's Customer Data to system-generated logs, describes data sharing options, and summarizes how data is protected.

Customer Data and system-generated logs

As defined in the Microsoft Product Terms, Customer Data means all data, including all text, sound, video, or image files, and software, that are provided to Microsoft by, or on behalf of, the Customer through use of the Online Service. Customer Data doesn't include Professional Services Data or information used to configure resources in the Online Services such as technical settings and resource names.

Microsoft online services create system-generated logs as part of the regular operation of the services. System-generated logs continuously record system activity over time to allow Microsoft to monitor whether systems are operating as expected. "Logging" (the storage and processing of logs) is essential to identify, detect, respond to, and prevent operational problems, policy violations, and fraudulent activity. Logging is also essential to optimize system, network, and application performance, as well as to help with security investigations and resilience activities and to comply with laws and regulations.

The following table compares Copilot for Security's Customer Data to system-generated logs.

Customer Data:

- Prompts that users submit to Copilot for Security.
- Information retrieved to generate responses.
- Responses.
- Content of pinned items.
- File uploads.

System-generated logs:

- Account information (tenant ID, account ID, licensing, and others).
- Usage data.
- Performance information.
- Internal system behavior information.
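The distinction matters in practice because customer content and operational telemetry follow different storage and retention rules. As an illustration only (not Copilot for Security's actual implementation), a service that keeps the two separate might write system-generated log entries that record activity metadata without the content itself; the function and field names here are hypothetical:

```python
import hashlib
import json
import time

def build_system_log(tenant_id: str, prompt: str, latency_ms: float) -> str:
    """Illustrative system-generated log entry: records that a prompt was
    handled (when, by whom, how fast) without storing the prompt text,
    which is Customer Data and follows separate retention rules."""
    entry = {
        "event": "prompt_handled",
        "tenant_id": tenant_id,    # account information
        "timestamp": time.time(),  # usage data
        "latency_ms": latency_ms,  # performance information
        # Only a one-way digest for correlation; the content itself is excluded.
        "prompt_digest": hashlib.sha256(prompt.encode()).hexdigest()[:12],
    }
    return json.dumps(entry)

log_line = build_system_log("contoso-tenant", "Summarize incident 42", 830.5)
assert "Summarize incident 42" not in log_line  # no customer content in the log
```

The key design point the sketch shows: operational logs can support monitoring and troubleshooting while carrying none of the Customer Data categories listed above.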

Customer Data sharing preferences

Data sharing is turned on by default. Copilot owners can change data sharing settings for Customer Data during the first run experience, and at any time thereafter.

Enabling or disabling the Customer Data sharing preferences described in the following table won't affect Microsoft's rights or responsibilities under the Microsoft Products and Services Data Protection Addendum.

The following data sharing options are available:

Setting: Allow Microsoft to capture data from Copilot for Security to validate product performance using human review

Description: Such validations include, but aren't limited to:

- The ability of Copilot for Security to successfully respond to user requests, and to identify capability gaps that need to be addressed based on user prompts.

- Understanding the types of tasks customers use Copilot for Security for.

- Producing metrics on the usability and quality of responses.

- Validating Copilot for Security capabilities involving other Microsoft products that a customer has purchased and integrated.

- Improving responses from plugins that access other Microsoft products.

For more information, see Set up location for prompt evaluation and opt in (or out of) data sharing.

Setting: Allow Microsoft to capture and human review data from Copilot for Security to build and validate Microsoft's security AI model

Description: Captured data is used to develop security-specific models built on top of the Azure OpenAI foundational model, which power more intelligent and personalized capabilities for Copilot for Security and the other Microsoft products it integrates with.

NOTE: Data isn't shared with OpenAI or used to train the Azure OpenAI foundational model.

Accessing data from Microsoft 365 services

Copilot for Security integrates seamlessly with multiple Microsoft 365 and Microsoft security services that your organization has licensed. You can allow users to query information directly from those services in both the standalone and embedded experiences.

Note

Currently, Copilot for Security only accesses Microsoft 365 services data processed by Microsoft Purview, as well as Customer Data generated by Microsoft Purview (for example, DLP alerts).

In Microsoft Purview, services such as data loss prevention (DLP), Insider Risk Management (IRM), or communication compliance are configured by the admin to run on Microsoft 365 services data (or other data types).

The data types that Copilot for Security can access are dictated by what an admin has configured for Microsoft Purview.

The following list summarizes the Microsoft 365 services data accessed by Copilot for Security.

- Data Loss Prevention (DLP): DLP alert data associated with a DLP match
- Microsoft Information Protection: Activity logs associated with labeling activity
- eDiscovery: Data captured within a review set of an eDiscovery search
- Insider Risk Management (IRM): IRM alert data associated with an IRM policy alert
- Communication Compliance: Data captured within a policy match of a Communication Compliance policy

Microsoft 365 services data accessed by Copilot for Security, including Customer Data generated by Microsoft Purview, is processed and stored according to the data processing activities described herein. This means that Microsoft 365 data accessed by Copilot for Security will be processed and stored in the locations described herein, regardless of the location in which the data was processed or stored (pursuant to EU Data Boundary Services and the data residency commitments under the "Customer Data at Rest for Core Online Services" section of the Product Terms) before the data was accessed by Copilot for Security. This also means that Microsoft 365 data accessed by Copilot for Security will be processed pursuant to the security practices and policies applicable to Copilot for Security, regardless of the security practices and policies that applied to the data (under the "Security Practices and Policies for Core Online Services" section of the Product Terms) before the Microsoft 365 data was accessed by Copilot for Security.

To learn more about information captured, recorded, and retained by Microsoft Purview, see Learn about auditing solutions in Microsoft Purview. For information about activities that are audited in Microsoft 365, see Audit log activities.

Setting: Allow Copilot for Security to access data from your Microsoft 365 services

Description: When turned on:

- Copilot for Security can retrieve your data from a Microsoft 365 service on your behalf if you're a customer of both Copilot for Security and the Microsoft 365 service, and you allow Copilot for Security access to your Microsoft 365 services. See the note in the prior section for more information about the Microsoft 365 services data accessed by Copilot for Security.

- All data, including Microsoft 365 data, returned to answer your queries in Copilot for Security is retained pursuant to Copilot for Security's data retention policy. For more information, see Customer Data storage location.

Configuring Microsoft 365 services data access

Use the following steps to turn on or off Copilot for Security's access to Microsoft 365 services.

  1. In Copilot for Security, go to Settings > Owner settings.

  2. Update your data sharing selection.

Warning

Turning off Microsoft 365 data access doesn't mean that data already retrieved from those Microsoft 365 services is deleted at that instant. Microsoft 365 data accessed up to that time is deleted pursuant to Copilot for Security's data retention policy. For more information, see Data retention and deletion.

Customer Data storage location

Customer Data is stored at rest in the tenant's home "Geo" if a customer hasn't opted in to data sharing. For example, a customer tenant whose home is in Germany has its Customer Data stored in "Europe", the designated Geo for Germany.

When data sharing is opted in, Customer Data such as prompts and responses is shared with Microsoft to enhance product performance, improve accuracy, and address response latency. In this case, Customer Data (except uploaded files) may be stored outside of the tenant Geo. While uploaded files are not stored outside of the tenant Geo, if content from uploaded files is part of information retrieved to generate responses during sessions, that retrieved content can be stored outside of the tenant Geo.

For more information, see Data residency in Azure.

File upload storage and processing

Uploaded files are always stored in the home Geo of the tenant. Uploaded files are stored in the Copilot for Security service, not inside your tenant boundary. Uploaded files are only available to the user account that uploaded them, and not available to other users within or outside the tenant.

When data sharing is opted in, Microsoft may only capture and human-review content from uploaded files when that content is part of information retrieved to generate responses.

Audit log in Microsoft Purview for Security Copilot

The audit logging capability in Microsoft Purview for Security Copilot captures the following types of data:

  • Admin events - Privileged actions such as changes to tenant-level settings or administrative changes (for example, data sharing, plugin and promptbook configurations).

  • Activity metadata - Logs of user interactions within the Security Copilot platform (for example, a user asked a prompt at a specific time with information on the activity type).

    Note

    This does not include customer content such as the actual prompt and response.

For more information, see Access the audit log.

Microsoft Purview will store your Customer Data in the region where your Microsoft 365 data is stored. For more information, see Data Residency support for Microsoft Purview. The default retention period for audit logs is 180 days, but can be extended using audit log retention policies. For more information, see Manage audit log retention policies.

Data retention and deletion

Copilot for Security stores Customer Data necessary for in-product functionality, such as your session data (for example, your prompts and responses), for as long as you have an active subscription to Copilot for Security.

Customer Data can be deleted in the following scenarios:

  • When you delete all provisioned capacity
    Customer Data is deleted within 180 days of when you delete all provisioned capacity. For more information, see Delete capacity.

  • When you request that your Customer Data be deleted
    You can request that Copilot for Security delete your Customer Data through the portal (https://securitycopilot.microsoft.com) or through customer support. This Customer Data is deleted within 30 days of that request.

When you opt in to sharing your Customer Data with Copilot for Security, that Customer Data is retained by Copilot for Security for only 90 days before being deleted by the team evaluating it.

If you opt out of data sharing, Copilot for Security deletes all Customer Data shared within 30 days. Customer Data is retained by you in your tenant so long as you have an active subscription to Copilot for Security and have not requested it be deleted.
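For illustration, the retention windows above (180 days after all provisioned capacity is deleted, 30 days after a deletion request or an opt-out of data sharing) can be expressed as a simple deadline calculation. This is only a sketch of the stated timelines; the trigger names and dates are hypothetical:

```python
from datetime import date, timedelta

# Retention windows described above, keyed by the event that starts the clock.
RETENTION_DAYS = {
    "capacity_deleted": 180,   # all provisioned capacity deleted
    "deletion_requested": 30,  # customer requests deletion via portal or support
    "sharing_opted_out": 30,   # customer opts out of data sharing
}

def deletion_deadline(trigger: str, event_date: date) -> date:
    """Latest date by which the affected Customer Data is deleted."""
    return event_date + timedelta(days=RETENTION_DAYS[trigger])

print(deletion_deadline("deletion_requested", date(2024, 6, 1)))  # 2024-07-01
```

A deadline table like this is the kind of check a compliance team might script against its own records; the authoritative windows remain those stated in the documentation above.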

Location for prompt evaluation

With any Microsoft Copilot product, prompts refer to the text-based, natural language input you provide in the prompt bar that instructs Copilot for Security to generate a response. Prompts are the primary input Copilot needs to generate answers that help you in your security-related tasks. Prompts are evaluated using GPU resources in Azure datacenters protected with Azure security and privacy controls.

You can select where prompts are evaluated from any of the following locations:

  • Australia (ANZ)
  • Europe (EU)
  • United Kingdom (UK)
  • United States (US)

You can opt in to having prompts evaluated anywhere in the world to mitigate potential disruptions in case your primary location experiences high activity. 

Microsoft recommends having prompts evaluated anywhere with available GPU capacity, which enables the Copilot system to determine the optimal location based on load, latency, and responsiveness. 
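As a toy illustration of that recommendation (not Copilot for Security's actual routing logic), a selector might honor a pinned region when available and otherwise, if the customer opted in to worldwide evaluation, fall back to the lowest-latency region. The region codes come from the list above; the latency figures and function are hypothetical:

```python
from typing import Dict, Optional

def pick_evaluation_region(
    measured_latency_ms: Dict[str, float],
    preferred: Optional[str],
    allow_anywhere: bool,
) -> str:
    """Return the preferred region if it's currently available; otherwise,
    when worldwide evaluation is enabled, the lowest-latency region."""
    if preferred in measured_latency_ms:
        return preferred
    if allow_anywhere:
        return min(measured_latency_ms, key=measured_latency_ms.get)
    raise RuntimeError("Preferred region unavailable and worldwide evaluation not enabled")

# Hypothetical measurements for the four selectable locations.
latencies = {"ANZ": 240.0, "EU": 90.0, "UK": 85.0, "US": 120.0}
print(pick_evaluation_region(latencies, preferred=None, allow_anywhere=True))  # UK
```

The design trade-off mirrors the text: pinning a region gives predictability, while opting in to "anywhere" lets the system optimize for load, latency, and responsiveness.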

Note

Data (sessions) will always be stored within your tenant home Geo unless you opt in to Customer Data sharing. For more information, see Customer Data storage location.

Set up location for prompt evaluation and opt in (or out of) data sharing

During initial setup, Copilot owners are prompted to set data sharing and prompt evaluation options. For more information, see Get started with Copilot for Security. Copilot owners can change these settings during the first run experience, or at any time thereafter.

Authorized role
You need to be a Copilot owner to change the data sharing options. For more information on roles, see Understand authentication.

Set up data sharing

During initial setup, a Copilot owner is provided with the following data sharing options:

Setting: Allow Microsoft to capture data from Copilot for Security to validate product performance using human review

Description: Such validations include, but aren't limited to:

- The ability of Copilot for Security to successfully respond to user requests, and to identify capability gaps that need to be addressed based on user prompts.

- Understanding the types of tasks customers use Copilot for Security for.

- Producing metrics on the usability and quality of responses.

- Validating Copilot for Security capabilities involving other Microsoft products that a customer has purchased and integrated.

- Improving responses from plugins that access other Microsoft products.

Setting: Allow Microsoft to capture and human review data from Copilot for Security to build and validate Microsoft's security AI model

Description: Captured data is used to develop security-specific models built on top of the Azure OpenAI foundational model, which power more intelligent and personalized capabilities for Copilot for Security and the other Microsoft products it integrates with.

NOTE: Data isn't shared with OpenAI or used to train the Azure OpenAI foundational model.
  • When you opt in to data sharing, your Customer Data is shared with Microsoft from that point forward.
  • When you opt out of data sharing, no further Customer Data is shared. Customer Data that was shared previously is retained for not more than 180 days.

Updating data sharing

  1. In Copilot for Security, go to Settings > Owner settings.

  2. Update your data sharing selection.

How Microsoft protects your data

Microsoft uses comprehensive controls to protect your data. All Copilot for Security data is handled according to Microsoft's commitments to privacy, security, compliance, and responsible AI practices. Access to the systems that house your data is governed by Microsoft's certified processes.

Copilot for Security runs queries as the user, so it never has elevated privileges beyond what the user has.

If you opt in to share Customer Data, your data is:

  • Not shared with OpenAI
  • Not used for sales
  • Not shared with third parties
  • Not used to train Azure OpenAI foundational models

Copilot for Security meets all Azure production data compliance standards.

All data stored in Azure is automatically encrypted at rest and uses AES-256 encryption. For more information, see Data encryption and Encryption at rest.

Microsoft security products data handling

Microsoft Security products purchased by you may share data, including Customer Data, as described in the product documentation. Customer Data shared with Copilot for Security is governed by the Product Terms, Data Protection Addendum, and documentation applicable to Copilot for Security. For Microsoft 365 services, an administrator needs to enable Copilot for Security in the sharing preference option detailed in Accessing data from Microsoft 365 services, and users need to enable a plugin for those Microsoft 365 services. For other Microsoft services, such plugins are enabled by default for users. Users can turn off plugins at any time. For more information, see Manage plugins.

Feedback from Copilot for Security users

Microsoft collects feedback on the response produced by Microsoft Copilot for Security from users of the product. A Copilot owner can turn off feedback collection for their tenant by contacting Microsoft Support through a support ticket. For more information, see Contact support.

See also

Data, privacy, and security for Azure OpenAI Service

Microsoft responsible AI principles