Microsoft Purview data security and compliance protections for Microsoft Copilot

Microsoft 365 licensing guidance for security & compliance

While AI-powered productivity tools unlock valuable insights and boost user productivity, they also introduce new user activities and generate new data. Like other enterprise activities and data, these require security and compliance management.

The following capabilities from Microsoft Purview strengthen your data security and compliance for Microsoft Copilot for Microsoft 365:


To check whether your organization's licensing plans support these capabilities, see the licensing guidance link at the top of the page. For licensing information for Microsoft Copilot for Microsoft 365 itself, see the service description for Microsoft Copilot for Microsoft 365.

Use the following sections to learn more about how these Microsoft Purview capabilities provide additional data security and compliance controls to accelerate your organization's adoption of Microsoft Copilot. If you're new to Microsoft Purview, you might also find an overview of the product helpful: Learn about Microsoft Purview.

For more general information about security and compliance requirements for Copilot for Microsoft 365, see Data, Privacy, and Security for Microsoft Copilot for Microsoft 365.

Microsoft Purview strengthens information protection for Copilot

Copilot uses existing controls to ensure that data stored in your tenant is never returned to the user or used by a large language model (LLM) if the user doesn't have access to that data. When the data has sensitivity labels from your organization applied to the content, there's an extra layer of protection:

  • When a file is open in Word, Excel, or PowerPoint, or an email or calendar event is open in Outlook, the sensitivity of the data is displayed to users in the app with the label name and content markings (such as header or footer text) that are configured for the label.

  • When the sensitivity label applies encryption, users must have the EXTRACT usage right, as well as VIEW, for Copilot to return the data.

  • This protection extends to data stored outside your Microsoft 365 tenant when it's open in an Office app (data in use), for example, from local storage, network shares, and cloud storage.


If you haven't already, we recommend you enable sensitivity labels for SharePoint and OneDrive and also familiarize yourself with the file types and label configurations that these services can process. When sensitivity labels aren't enabled for these services, the encrypted files that Copilot for Microsoft 365 can access are limited to data in use from Office apps on Windows.

For instructions, see Enable sensitivity labels for Office files in SharePoint and OneDrive.

Additionally, when you use Microsoft Copilot Graph-grounded chat (formerly Microsoft 365 Chat), which can access data from a broad range of content, the sensitivity of labeled data returned by Copilot for Microsoft 365 is made visible to users: the sensitivity label is displayed for citations and for the items listed in the response. Using the sensitivity label priority numbers defined in the Microsoft Purview compliance portal, the latest response in Copilot displays the highest-priority sensitivity label from the data used for that Copilot chat.

Compliance admins define a sensitivity label's priority, and a higher priority number usually denotes more sensitive content with more restrictive permissions. As a result, Copilot responses are labeled with the most restrictive sensitivity label.
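The selection logic described above can be sketched as follows. This is a minimal illustration, not Microsoft's implementation; the item names, label names, and priority values are hypothetical examples:

```python
# Sketch: choose the sensitivity label shown on a Copilot response.
# A higher priority number denotes a more sensitive label, as configured
# in the Microsoft Purview compliance portal. All data below is made up.

def label_for_response(referenced_items):
    """Return the highest-priority label among the items Copilot used."""
    labeled = [item for item in referenced_items if item.get("label")]
    if not labeled:
        return None  # no labeled sources, so no label to display
    return max(labeled, key=lambda item: item["label"]["priority"])["label"]

items = [
    {"name": "roadmap.docx", "label": {"name": "General", "priority": 1}},
    {"name": "finance.xlsx", "label": {"name": "Confidential", "priority": 5}},
    {"name": "notes.txt", "label": None},
]

print(label_for_response(items)["name"])  # Confidential
```

Because `Confidential` has the highest priority number (5) among the referenced items, it's the label a response drawing on all three items would display.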


Similarly, if items are encrypted by Microsoft Purview Information Protection but don't have a sensitivity label, Microsoft Copilot for Microsoft 365 won't return those items to users when the encryption doesn't grant the user both the EXTRACT and VIEW usage rights.

If you're not already using sensitivity labels, see Get started with sensitivity labels.

Although DLP policies don't yet support interactions for Microsoft Copilot for Microsoft 365, data classification with sensitive information types and trainable classifiers is supported to identify sensitive data in user prompts to Copilot and in its responses.

Copilot protection with sensitivity label inheritance

When you use Copilot to create new content based on an item that has a sensitivity label applied, the sensitivity label from the source file is automatically inherited, with the label's protection settings.

For example, a user selects Draft with Copilot in Word and then Reference a file. Or a user selects Create presentation from file in PowerPoint. The source content has the sensitivity label Confidential\Anyone (unrestricted) applied and that label is configured to apply a footer that displays "Confidential". The new content is automatically labeled Confidential\Anyone (unrestricted) with the same footer.

To see an example of this in action, watch the following demo from the Ignite 2023 session, Getting your enterprise ready for Microsoft 365 Copilot. The demo shows how the default sensitivity label of General is replaced with a Confidential label when a user drafts with Copilot and references a labeled file. The information bar under the ribbon informs the user that content created by Copilot resulted in the new label being automatically applied:

If multiple files are used to create new content, the sensitivity label with the highest priority is used for label inheritance.

As with all automatic labeling scenarios, the user can always override and replace an inherited label (or remove, if you're not using mandatory labeling).

Microsoft Purview protection without sensitivity labels

Even if a sensitivity label isn't applied to content, services and products might use the encryption capabilities from the Azure Rights Management service. As a result, Copilot for Microsoft 365 can still check for the VIEW and EXTRACT usage rights before returning data and links to a user, but there's no automatic inheritance of protection for new items.


You'll get the best user experience when you always use sensitivity labels to protect your data, and when encryption is applied by a label.

Examples of products and services that can use the encryption capabilities from the Azure Rights Management service without sensitivity labels:

  • Microsoft Purview Message Encryption
  • Microsoft Information Rights Management (IRM)
  • Microsoft Rights Management connector
  • Microsoft Rights Management SDK

For other encryption methods that don't use the Azure Rights Management service:

  • S/MIME protected emails won't be returned by Copilot, and Copilot isn't available in Outlook when an S/MIME protected email is open.

  • Password-protected documents can't be accessed by Copilot for Microsoft 365 unless they're already opened by the user in the same app (data in use). Passwords aren't inherited by a destination item.

As with other Microsoft 365 services, such as eDiscovery and search, items encrypted with Microsoft Purview Customer Key or your own root key (BYOK) are supported and eligible to be returned by Copilot for Microsoft 365.

Copilot honors existing protection with the EXTRACT usage right

Although you might not be very familiar with the individual usage rights for encrypted content, they've been around a long time, from Windows Server Rights Management, to Active Directory Rights Management, to the cloud version that became Azure Information Protection with the Azure Rights Management service.

If you've ever received a "Do Not Forward" email, it's using usage rights to prevent you from forwarding the email after you've been authenticated. As with other bundled usage rights that map to common business scenarios, a Do Not Forward email grants the recipient usage rights that control what they can do with the content, and it doesn't include the FORWARD usage right. In addition to not forwarding, you can't print this Do Not Forward email, or copy text from it.

The usage right that grants permission to copy text is EXTRACT, with the more user-friendly, common name of Copy. It's this usage right that determines whether Copilot for Microsoft 365 can display text to the user from encrypted content.


Because the Full control (OWNER) usage right includes all the usage rights, EXTRACT is automatically included with Full control.
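The gating behavior described in this section can be modeled as set membership. This is an illustrative sketch, not Copilot's actual implementation; the right names follow the Azure Rights Management usage rights discussed above, and the rights set is abbreviated:

```python
# Sketch of the usage-rights gate: Copilot returns encrypted content
# only when the user holds both the VIEW and EXTRACT usage rights.
# Full control (OWNER) implies every other usage right.

ALL_RIGHTS = {"VIEW", "EXTRACT", "EDIT", "PRINT", "FORWARD", "OWNER"}

def effective_rights(granted):
    """Expand OWNER into the full set of usage rights."""
    rights = set(granted)
    if "OWNER" in rights:
        rights |= ALL_RIGHTS
    return rights

def copilot_can_return(granted):
    """Both VIEW and EXTRACT are required for Copilot to return content."""
    return {"VIEW", "EXTRACT"} <= effective_rights(granted)

print(copilot_can_return({"OWNER"}))            # True: OWNER implies all
print(copilot_can_return({"VIEW", "EXTRACT"}))  # True
print(copilot_can_return({"VIEW"}))             # False: EXTRACT missing
```

This also models why a Do Not Forward email, which grants VIEW but not EXTRACT, isn't returned by Copilot, while the sender (the Rights Management owner) always passes the check.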

When you use the Microsoft Purview compliance portal to configure a sensitivity label to apply encryption, the first choice is whether to assign the permissions now, or let users assign the permissions. If you assign them now, you configure the permissions either by selecting a predefined permission level with a preset group of usage rights, such as Co-Author or Reviewer, or by selecting custom permissions where you individually select the available usage rights.

In the Microsoft Purview compliance portal, the EXTRACT usage right is displayed as Copy and extract content (EXTRACT). For example, the default permission level selected is Co-Author, where you can see that Copy and extract content (EXTRACT) is included. As a result, content protected with this encryption configuration can be returned by Copilot for Microsoft 365:

Configuration of usage rights for a sensitivity label where the permissions include EXTRACT.

If you select Custom from the dropdown box, and then Full control (OWNER) from the list, this configuration also grants the EXTRACT usage right.


The person applying the encryption always has the EXTRACT usage right, because they are the Rights Management owner. This special role automatically includes all usage rights and some other actions, which means that content a user has encrypted themselves is always eligible to be returned to them by Copilot for Microsoft 365. The configured usage restrictions apply to other people who are authorized to access the content.

Alternatively, if you select the encryption configuration to let users assign permissions, for Outlook, this configuration includes predefined permissions for Do Not Forward and Encrypt-Only. The Encrypt-Only option, unlike Do Not Forward, does include the EXTRACT usage right.

When you select custom permissions for Word, Excel, and PowerPoint, users select their own permissions in the Office app when they apply the sensitivity label. They're informed that from the two selections, Read doesn't include the permission to copy content, but Change does. These references to copy refer to the EXTRACT usage right. If the user selects More Options, they can add the EXTRACT usage right to Read by selecting Allow users with read access to copy content.

User dialog box to select permissions that include EXTRACT usage right.


If you need to check whether a document you're authorized to view includes the EXTRACT usage right, open it in the Windows Office app and customize the status bar to show Permissions. Select the icon next to the sensitivity label name to display My Permission. View the value for Copy, which maps to the EXTRACT usage right, and confirm whether it displays Yes or No.

For emails, if the permissions aren't displayed at the top of the message in Outlook for Windows, select the information banner with the label name, and then select View Permission.

Copilot honors the EXTRACT usage right for a user, however it was applied to the content. Most times, when the content is labeled, the usage rights granted to the user match those from the sensitivity label configuration. However, there are some situations that can result in the content's usage rights being different from the applied label configuration:

For more information about configuring a sensitivity label for encryption, see Restrict access to content by using sensitivity labels to apply encryption.

For technical details about the usage rights, see Configure usage rights for Azure Information Protection.

Microsoft Purview supports compliance management for Copilot

Use Microsoft Purview compliance capabilities to support your risk and compliance requirements for Copilot for Microsoft 365.

Interactions with Copilot can be monitored for each user in your tenant. This means you can use Purview's classification (sensitive info types and trainable classifiers), content search, communication compliance, auditing, eDiscovery, and automatic retention and deletion capabilities by using retention policies.

For communication compliance, you can analyze user prompts and Copilot responses to detect inappropriate or risky interactions or sharing of confidential information. For more information, see the Microsoft Copilot for Microsoft 365 section in Detect channel signals with communication compliance.


For auditing, details are captured when users interact with Copilot. Events include how and when users interact with Copilot, in which Microsoft 365 service the activity took place, and references to the files stored in Microsoft 365 that were accessed during the interaction. If these files have a sensitivity label applied, that's also captured. In the Audit solution from the Microsoft Purview compliance portal, select Copilot activities and Interacted with Copilot. You can also select Copilot as a workload.
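Once exported from the Audit solution, these records can be processed programmatically. The sketch below filters an exported batch for Copilot interactions; the `Operation` value and the field names are illustrative assumptions, so verify them against the audit schema in your own tenant:

```python
import json

# Sketch: filter audit records exported from Microsoft Purview Audit
# for Copilot interactions. The "CopilotInteraction" operation name and
# all field names here are assumptions for illustration only.

sample_export = '''[
 {"Operation": "CopilotInteraction", "UserId": "adele@contoso.com",
  "Workload": "Copilot", "AccessedResources": ["finance.xlsx"]},
 {"Operation": "FileAccessed", "UserId": "adele@contoso.com",
  "Workload": "SharePoint"}
]'''

records = json.loads(sample_export)
copilot_events = [r for r in records
                  if r["Operation"] == "CopilotInteraction"]

for event in copilot_events:
    # Report who interacted with Copilot and which files were referenced.
    print(event["UserId"], event.get("AccessedResources", []))
```

A filter like this surfaces exactly the details the audit events capture: who interacted with Copilot, in which workload, and which stored files were accessed during the interaction.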

Auditing options to identify user interactions with Microsoft Copilot for Microsoft 365.

For content search, because user prompts to Copilot and responses from Copilot are stored in a user's mailbox, they can be searched and retrieved when the user's mailbox is selected as the source for a search query. Select and retrieve this data from the source mailbox by selecting Add condition > Type > Copilot interactions.

Similarly for eDiscovery, you use the same query process to select mailboxes and retrieve user prompts to Copilot and responses from Copilot. After the collection is created and added to the review phase in eDiscovery (Premium), this data is available for all the existing review actions. These collections and review sets can then be placed on hold or exported. If you need to delete this data, see Search for and delete data for Microsoft Copilot for Microsoft 365.

For retention policies that support automatic retention and deletion, user prompts to Copilot and responses from Copilot are identified by the location Teams chats and Copilot interactions. This location was previously named just Teams chats, but users don't need to be using Teams chat for the policy to apply to them. Any existing retention policies previously configured for Teams chats now automatically include user prompts and responses to and from Microsoft Copilot for Microsoft 365:

Updated Teams chats retention location to include interactions for Microsoft Copilot for Microsoft 365.

For detailed information about how this retention works, see Learn about retention for Microsoft Copilot for Microsoft 365.

As with all retention policies and holds, if more than one policy for the same location applies to a user, the principles of retention resolve any conflicts. For example, the data is retained for the longest duration of all the applied retention policies or eDiscovery holds.
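That conflict-resolution rule can be sketched as taking the maximum duration across everything that applies. The policy names and durations below are example values, not defaults:

```python
from datetime import timedelta

# Sketch: when several retention policies or eDiscovery holds apply to
# the same location, the principles of retention keep the data for the
# longest applicable duration. All names and durations are examples.

applied_policies = [
    {"name": "Teams chats and Copilot interactions",
     "retain": timedelta(days=365)},
    {"name": "Legal hold",
     "retain": timedelta(days=2555)},
    {"name": "Default workspace policy",
     "retain": timedelta(days=180)},
]

# The effective retention is the longest of all applied durations.
effective = max(applied_policies, key=lambda p: p["retain"])
print(effective["name"], effective["retain"].days)  # Legal hold 2555
```

Here the Copilot interactions would be retained for the full seven-year legal hold, even though the dedicated Copilot policy only specifies one year.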

For retention labels to automatically retain files referenced in Copilot, select the option for cloud attachments with an auto-apply retention label policy: Apply label to cloud attachments and links shared in Exchange, Teams, Viva Engage, and Copilot. As with all retained cloud attachments, the file version at the time it's referenced is retained.

Updated cloud attachments option for auto-apply retention label to include interactions for Copilot.

For detailed information about how this retention works, see How retention works with cloud attachments.

For configuration instructions:

Other documentation for Copilot

For more detailed information, see Considerations for deploying Microsoft Purview data security and compliance protections for Copilot.

To learn more about Copilot for Microsoft 365 and how your organization can use this copilot for work, see the Microsoft Copilot for Microsoft 365 documentation.

To learn how to apply Zero Trust to Copilot for Microsoft 365, see Apply principles of Zero Trust to Microsoft Copilot for Microsoft 365.