Microsoft Purview data security and compliance protections for Microsoft Copilot
- Microsoft 365 licensing guidance for security & compliance
- Microsoft Purview Audit service description
- Microsoft Purview eDiscovery service description
While AI-powered productivity tools unlock valuable insights and boost user productivity, they also introduce new user activities and generate large amounts of data. Just like other enterprise activities and data, these require security and compliance management.
The following capabilities from Microsoft Purview strengthen your data security and compliance for Microsoft Copilot for Microsoft 365:
- Sensitivity labels and content encrypted by Microsoft Purview Information Protection
- Data classification
- Customer Key
- Communication compliance
- Auditing
- Content search
- eDiscovery
- Retention and deletion
- Customer Lockbox
Use the following sections to learn more about how Microsoft Purview integration provides additional data security and compliance controls to accelerate your organization's adoption of Copilot.
For licensing information to use these capabilities for Copilot for Microsoft 365, see the licensing and service description links at the top of the page. For licensing information for Copilot for Microsoft 365, see the service description for Microsoft Copilot for Microsoft 365.
For more general information about security and compliance requirements for Copilot for Microsoft 365, see Data, Privacy, and Security for Microsoft Copilot for Microsoft 365.
Microsoft Purview strengthens information protection for Copilot
Copilot uses existing controls to ensure that data stored in your tenant is never returned to the user or used by an LLM if the user doesn't have access to that data. When the data has sensitivity labels applied, there's an extra layer of protection:
When a file is open in Word, Excel, or PowerPoint, or an email or calendar event is open in Outlook, the sensitivity of the data is displayed to users in the app with the label name and content markings (such as header or footer text) that have been configured for the label.
When the sensitivity label applies encryption, users must have the EXTRACT usage right, as well as VIEW, for Copilot to return the data.
This protection extends to data stored outside your Microsoft 365 tenant when it's open in an Office app (data in use), for example, data in local storage, on network shares, and in cloud storage.
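To make the rule concrete, here's a minimal sketch of the check described above. It's illustrative pseudologic only, not a Copilot or Purview API, and the type and function names are assumptions: an item is eligible to be returned only if the user already has access to it and, when the item is encrypted, the granted usage rights include both VIEW and EXTRACT.

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    name: str
    is_encrypted: bool = False
    # Hypothetical map of user -> usage rights granted to that user, for example {"VIEW", "EXTRACT"}
    usage_rights: dict[str, set[str]] = field(default_factory=dict)

def copilot_can_return(item: Item, user: str, user_has_access: bool) -> bool:
    """Illustrative gate: existing access is honored first, then usage rights for encrypted items."""
    if not user_has_access:
        return False                      # Copilot never returns data the user can't already access
    if not item.is_encrypted:
        return True                       # no usage-right check needed for unencrypted content
    rights = item.usage_rights.get(user, set())
    return {"VIEW", "EXTRACT"} <= rights  # both usage rights are required
```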
Tip
If you haven't already, we recommend you enable sensitivity labels for SharePoint and OneDrive and also familiarize yourself with the file types and label configurations that these services can process. When sensitivity labels aren't enabled for these services, the encrypted files that Copilot for Microsoft 365 can access are limited to data in use from Office apps on Windows.
For instructions, see Enable sensitivity labels for Office files in SharePoint and OneDrive.
Additionally, when you use the Microsoft 365 Chat feature in Copilot, which can access data from a broad range of content, the sensitivity of labeled data returned by Copilot for Microsoft 365 is made visible to users: the sensitivity label is displayed for citations and for the items listed in the response. Using the sensitivity labels' priority number defined in the Microsoft Purview compliance portal, the latest response in Microsoft 365 Chat displays the highest-priority sensitivity label from the data used for that Copilot chat.
Compliance admins define a sensitivity label's priority, and a higher priority number usually denotes more sensitive content with more restrictive permissions. As a result, Copilot responses are labeled with the most restrictive sensitivity label.
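As a minimal sketch of that selection rule (hypothetical types, not a Purview API), the displayed label can be thought of as the maximum-priority label across the items cited in the response:

```python
from dataclasses import dataclass

@dataclass
class SensitivityLabel:
    name: str
    priority: int  # as defined in the Microsoft Purview compliance portal

def chat_response_label(cited_labels: list[SensitivityLabel]) -> SensitivityLabel | None:
    """Illustrative only: the response shows the highest-priority label among the cited items."""
    return max(cited_labels, key=lambda label: label.priority, default=None)

# Example: Confidential (priority 5) outranks General (priority 1)
labels = [SensitivityLabel("General", 1), SensitivityLabel("Confidential", 5)]
print(chat_response_label(labels).name)  # Confidential
```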
Note
If items are encrypted by Microsoft Purview Information Protection but don't have a sensitivity label, Microsoft Copilot for Microsoft 365 also won't return these items to users if the encryption doesn't include the EXTRACT or VIEW usage rights for the user.
If you're not already using sensitivity labels, see Get started with sensitivity labels.
Although DLP policies don't yet support interactions with Microsoft Copilot for Microsoft 365, data classification with sensitive info types and trainable classifiers is supported to identify sensitive data in user prompts to Copilot and in Copilot responses.
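As a toy illustration of how pattern-based detection works (this is not the Purview classification engine or its definitions; the pattern, keywords, and function name are assumptions for the sketch), a sensitive info type can be thought of as a pattern plus supporting evidence that's evaluated against prompt and response text:

```python
import re

# Toy stand-in for a sensitive info type: a primary pattern plus supporting keywords
# that raise confidence. These are illustrative patterns, not Purview's definitions.
CREDIT_CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){15}\d\b")
SUPPORTING_KEYWORDS = ("card", "visa", "expiry", "cvv")

def classify_text(text: str) -> dict:
    """Return a rough classification result for a Copilot prompt or response."""
    match = CREDIT_CARD_PATTERN.search(text)
    if not match:
        return {"match": False}
    keywords = [kw for kw in SUPPORTING_KEYWORDS if kw in text.lower()]
    return {"match": True, "evidence": match.group(), "confidence": "high" if keywords else "low"}

print(classify_text("Summarize the card ending 4111 1111 1111 1111 before expiry."))
```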
Copilot protection with sensitivity label inheritance
When you use Copilot to create new content based on an item that has a sensitivity label applied, the sensitivity label from the source file is automatically inherited, with the label's protection settings.
For example, a user selects Draft with Copilot in Word and then Reference a file. Or a user selects Create presentation from file in PowerPoint. The source content has the sensitivity label Confidential\Anyone (unrestricted) applied and that label is configured to apply a footer that displays "Confidential". The new content is automatically labeled Confidential\Anyone (unrestricted) with the same footer.
If multiple files are used to create new content, the sensitivity label with the highest priority is used for label inheritance.
As with all automatic labeling scenarios, the user can always override and replace an inherited label (or remove, if you're not using mandatory labeling).
Microsoft Purview protection without sensitivity labels
Even if a sensitivity label isn't applied to content, many services and products use the encryption capabilities from Azure Information Protection and the Azure Rights Management service. As a result, Copilot for Microsoft 365 can still check for the VIEW and EXTRACT usage rights before returning data and links to a user, but there's no automatic inheritance of protection for new items.
Examples of products and services that can use the encryption capabilities from Azure Information Protection without sensitivity labels:
- Microsoft Purview Message Encryption
- Microsoft Information Rights Management (IRM)
- Microsoft Rights Management connector
- Microsoft Rights Management SDK
For other encryption methods that don't use Azure Information Protection:
- S/MIME protected emails won't be returned by Copilot, and Copilot isn't available in Outlook when an S/MIME protected email is open.
- Password-protected documents can't be accessed by Copilot for Microsoft 365 unless they're already opened by the user in the same app (data in use). Passwords aren't inherited by a destination item.
As with other Microsoft 365 services, such as eDiscovery and search, items encrypted with Microsoft Purview Customer Key are supported and eligible to be returned by Copilot for Microsoft 365.
Copilot honors existing protection with the EXTRACT usage right
Although you might not be very familiar with the individual usage rights for encrypted content, they've been around a long time: from Windows Server Rights Management, to Active Directory Rights Management, to the cloud version that became Azure Information Protection with the Azure Rights Management service.
If you've ever received a "Do Not Forward" email, it's using usage rights to prevent you from forwarding the email after you've been authenticated. As with other bundled usage rights that map to common business scenarios, a Do Not Forward email grants the recipient usage rights that control what they can do with the content, and it doesn't include the FORWARD usage right. In addition to not forwarding, you can't print this Do Not Forward email, or copy text from it. The usage right that grants permission to copy text is EXTRACT, with the more user-friendly, common name of Copy. It's this usage right that determines whether Copilot for Microsoft 365 can display text to the user from encrypted content.
When you use the Microsoft Purview compliance portal to configure a sensitivity label to apply encryption, the first choice is whether to assign the permissions now, or let users assign the permissions. If you assign them now, you configure the permissions either by selecting a predefined permission level with a preset group of usage rights, such as Co-Author or Reviewer, or by selecting custom permissions, where you individually select the available usage rights.
In the Microsoft Purview compliance portal, the EXTRACT usage right is displayed as Copy and extract content (EXTRACT). For example, if the default permission level of Co-Author is selected, you see that Copy and extract content (EXTRACT) is included. As a result, content protected with this encryption configuration can be returned by Copilot for Microsoft 365.
Note
The person applying the encryption always has the EXTRACT usage right, because they are the Rights Management owner. This special role automatically includes all usage rights, which means that content a user has encrypted themselves is always eligible to be returned to them by Copilot for Microsoft 365. The configured usage restrictions apply to other people who are authorized to access the content.
If you select the encryption configuration that lets users assign permissions, for Outlook this configuration includes the predefined permissions Do Not Forward and Encrypt-Only. The Encrypt-Only option, unlike Do Not Forward, does include the EXTRACT usage right.
When you select custom permissions for Word, Excel, and PowerPoint, users select their own permissions in the Office app when they apply the sensitivity label. They're informed that from the two selections, Read doesn't include the permission to copy content, but Change does. These references to copy refer to the EXTRACT usage right. If the user selects More Options, they can add the EXTRACT usage right to Read by selecting Allow users with read access to copy content.
Tip
If you need to check whether a document you're authorized to view includes the EXTRACT usage right, open it in the Office app and customize the status bar to show Permissions. Select the icon next to the sensitivity label name to display My Permission. View the value for Copy, which maps to the EXTRACT usage right, and confirm whether it displays Yes or No.
For emails, if the permissions aren't displayed at the top of the message, select the information banner with the label name, and then select View Permission.
For more information about configuring a sensitivity label for encryption, see Restrict access to content by using sensitivity labels to apply encryption.
For technical details about the usage rights, see Configure usage rights for Azure Information Protection.
Microsoft Purview supports compliance management for Copilot
Use Microsoft Purview compliance capabilities to support your risk and compliance requirements for Copilot for Microsoft 365.
Interactions with Copilot can be monitored for each user in your tenant. As such, you can use Purview's classification (sensitive info types and trainable classifiers), content search, communication compliance, auditing, eDiscovery, and automatic retention and deletion capabilities by using retention policies.
For communication compliance, all new and existing policies that detect Teams messages will automatically detect interactions for Copilot. This makes it possible to analyze user prompts and Copilot responses to detect inappropriate or risky interactions or sharing of confidential information. For more information, see the Microsoft Copilot for Microsoft 365 section in Detect channel signals with communication compliance.
For auditing, details are captured when users interact with Copilot. Events include how and when users interact with Copilot, in which Microsoft 365 service the activity took place, and references to the files stored in Microsoft 365 that were accessed during the interaction. If these files have a sensitivity label applied, that's also captured. In the Audit solution from the Microsoft Purview compliance portal, select Copilot activities and Interacted with Copilot. You can also select Copilot as a workload.
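If you export the audit search results to a CSV file, a short script can pull out just the Copilot events. This is a minimal sketch under stated assumptions: the export file name is a placeholder, the column names (CreationDate, UserIds, Operations, AuditData) and the CopilotInteraction operation value reflect typical audit exports, and the nested AppHost field is an assumption; adjust them to match what your export actually contains.

```python
import csv
import json

# Minimal sketch: filter an audit search export (CSV downloaded from the compliance
# portal) down to Copilot interaction events. Column names and the nested AppHost
# field are assumptions based on typical exports; adjust to match your file.

def copilot_events(export_path: str):
    with open(export_path, newline="", encoding="utf-8-sig") as handle:
        for row in csv.DictReader(handle):
            if row.get("Operations") != "CopilotInteraction":
                continue
            details = json.loads(row.get("AuditData", "{}"))
            yield {
                "time": row.get("CreationDate"),
                "user": row.get("UserIds"),
                # AppHost identifies the Microsoft 365 app where the interaction happened
                "app": details.get("CopilotEventData", {}).get("AppHost"),
            }

for event in copilot_events("AuditLog.csv"):
    print(event)
```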
For content search, because user prompts to Copilot and responses from Copilot are stored in a user's mailbox, they can be searched and retrieved when the user's mailbox is selected as the source for a search query. Select and retrieve this data from the source mailbox by selecting Add condition > Type > Copilot interactions.
Similarly for eDiscovery, you use the same query process to select mailboxes and retrieve user prompts to Copilot and responses from Copilot. After the collection is created and committed to a review set in eDiscovery (Premium), this data is available for all the existing review actions. These collections and review sets can then be put on hold or exported. If you need to delete this data, see Search for and delete data for Microsoft Copilot for Microsoft 365.
For retention policies that support automatic retention and deletion, user prompts to Copilot and responses from Copilot are identified by the location Teams chats and Copilot interactions. This location was previously named just Teams chats, and users don't need to be using Teams chat for the policy to apply to them. Any existing retention policies previously configured for Teams chats now automatically include user prompts and responses to and from Microsoft Copilot for Microsoft 365.
For detailed information about how this retention works, see Learn about retention for Microsoft Copilot for Microsoft 365.
As with all retention policies and holds, if more than one policy for the same location applies to a user, the principles of retention resolve any conflicts. For example, the data is retained for the longest duration of all the applied retention policies or eDiscovery holds.
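A minimal sketch of that principle, assuming simple inputs (days of retention per applicable policy, with None standing in for an indefinite hold such as an eDiscovery hold):

```python
from datetime import date, timedelta

def retain_until(created: date, retention_days: list[int | None]) -> date | None:
    """Latest retain-until date across all applicable policies; None means an indefinite hold."""
    if any(days is None for days in retention_days):
        return None  # an indefinite hold blocks deletion for as long as it's in place
    return created + timedelta(days=max(retention_days))

# One policy retains for 1 year, another for 7 years: the 7-year date wins.
print(retain_until(date(2024, 1, 1), [365, 2555]))
# An eDiscovery hold (modeled here as None) keeps the data indefinitely.
print(retain_until(date(2024, 1, 1), [365, None]))
```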
For retention labels to automatically retain files referenced in Copilot, select the option for cloud attachments with an auto-apply retention label policy: Apply label to cloud attachments and links shared in Exchange, Teams, Viva Engage, and Copilot. As with all retained cloud attachments, the file version at the time it's referenced is retained.
For detailed information about how this retention works, see How retention works with cloud attachments.
For configuration instructions:
- To configure communication compliance policies for Copilot interactions, see Create and manage communication compliance policies.
- To search the audit log for Copilot interactions, see Audit New Search.
- To use content search to find Copilot interactions, see Search for content.
- To use eDiscovery for Copilot interactions, see Microsoft Purview eDiscovery solutions.
- To create or change a retention policy for Copilot interactions, see Create and configure retention policies.
- To create an auto-apply retention label policy for files referenced in Copilot, see Automatically apply a retention label to retain or delete content.
Other documentation for Copilot
For more detailed information, see Considerations for deploying Microsoft Purview data security and compliance protections for Copilot.
To learn more about Copilot for Microsoft 365 and how your organization can use this copilot for work, see the Microsoft Copilot for Microsoft 365 documentation.
To learn how to apply Zero Trust to Copilot for Microsoft 365, see Apply principles of Zero Trust to Microsoft Copilot for Microsoft 365.