Microsoft Purview data security and compliance protections for generative AI apps

Microsoft 365 licensing guidance for security & compliance

Use Microsoft Purview to mitigate and manage the risks associated with AI usage, and implement corresponding protection and governance controls.

Discover, protect, and comply categories for generative AI usage and data by using Microsoft Purview.

Microsoft Purview AI Hub is currently in preview, and provides easy-to-use graphical tools and reports to quickly gain insights into AI use within your organization. One-click policies help you protect your data and comply with regulatory requirements.

Use the AI hub in conjunction with other Microsoft Purview capabilities to strengthen your data security and compliance for Microsoft Copilot for Microsoft 365.

Note

To check whether your organization's licensing plans support these capabilities, see the licensing guidance link at the top of the page. For licensing information for Microsoft Copilot for Microsoft 365 itself, see the service description for Microsoft Copilot for Microsoft 365.

Use the following sections to learn more about the AI hub and the Microsoft Purview capabilities that provide additional data security and compliance controls to accelerate your organization's adoption of Microsoft Copilot and other generative AI apps. If you're new to Microsoft Purview, you might also find an overview of the product helpful: Learn about Microsoft Purview.

For more general information about security and compliance requirements for Copilot for Microsoft 365, see Data, Privacy, and Security for Microsoft Copilot for Microsoft 365.

Microsoft Purview AI Hub provides insights, policies, and controls for AI apps

Note

Microsoft Purview AI Hub is currently in preview and subject to change.

The AI hub from the Microsoft Purview portal or the Microsoft Purview compliance portal provides a central management location to help you quickly secure data for AI apps and proactively monitor AI use. These apps include Microsoft Copilot for Microsoft 365 and AI apps from third-party large language models (LLMs).

Example screenshot of Microsoft Purview AI Hub, analytics.

The AI hub offers a set of capabilities so you can safely adopt AI without having to choose between productivity and protection:

  • Insights and analytics into AI activity in your organization

  • Ready-to-use policies to protect data and prevent data loss in AI prompts

  • Compliance controls to apply optimal data handling and storing policies

For a list of supported third-party AI sites, such as those used for Bard and ChatGPT, see Supported AI sites by Microsoft Purview for data security and compliance protections.

How to use the AI hub

To help you more quickly gain insights into AI usage and protect your data, the AI hub provides some preconfigured policies that you can activate with a single click. Allow at least 24 hours for these new policies to collect data and display the results in the AI hub, or to reflect any changes that you make to the default settings.

To get started with the AI hub, you can use either the Microsoft Purview portal or the Microsoft Purview compliance portal. You need an account that has appropriate permissions for compliance management, such as an account that's a member of the Microsoft Entra Compliance Administrator role.

  1. Depending on the portal you're using, navigate to the AI hub in the Microsoft Purview portal or to the AI hub in the Microsoft Purview compliance portal.

  2. From Analytics, review the Get started section to learn more about the AI hub and the immediate actions you can take. Select each action to display a flyout pane where you can learn more, take action, and verify your current status.

    • Turn on Microsoft Purview Audit: Auditing is on by default for new tenants, so you might already meet this prerequisite. If you do, and users are already assigned licenses for Copilot, you start to see insights about Copilot activities from the AI data analytics section further down the page.
    • Install Microsoft Purview browser extension: A prerequisite for third-party AI sites.
    • Onboard devices to Microsoft Purview: Also a prerequisite for third-party AI sites.
    • Extend your insights for data discovery: One-click policies for collecting information about users visiting third-party generative AI sites and sending sensitive information to them. The option is the same as the Extend your insights button in the AI data analytics section further down the page.

    For more information about the prerequisites, see Prerequisites for the AI hub.

    For more information about the preconfigured policies that you can activate, see One-click policies from the AI hub.

  3. Then, review the Recommendations section and decide whether to implement any options that are relevant to your tenant.

  4. Select Policies from the AI hub, where you can quickly activate default policies to help you protect sensitive data sent to third-party generative AI sites and protect your data with sensitivity labels. For more information about these policies, see One-click policies from the AI hub. You need to wait at least a day for the reports to be populated.

    Under Recommendations and Fortify your data security for AI, select Get started to learn more, take action, and verify your current status.

    When your policies are created, including those from Analytics, you can monitor their status from this page. To edit them, use the corresponding management solution in the portal. For example, for Control unethical behavior in AI, you can review and remediate the matches from the Communication Compliance solution.

  5. Select Activity explorer from the AI hub to see details of the data collected from your policies. Selecting View Details from any of the charts from the Analytics page or Policies page also takes you to Activity explorer.

    This more detailed information includes activity, workload and app, user, date and time, and any sensitive information types detected.

    Examples of activities include AI interaction, Classification stamped, DLP rule match, and AI visit. For more information about the events, see Activity explorer events.

    Examples of apps include Microsoft Copilot and Other AI app.

    Example screenshot of Microsoft Purview AI Hub, activity explorer showing AI activities.

For Microsoft Copilot for Microsoft 365, use these policies and insights in conjunction with additional protections and compliance capabilities from Microsoft Purview.

Microsoft Purview strengthens information protection for Copilot

Copilot uses existing controls to ensure that data stored in your tenant is never returned to the user or used by a large language model (LLM) if the user doesn't have access to that data. When the data has sensitivity labels from your organization applied to the content, there's an extra layer of protection:

  • When a file is open in Word, Excel, or PowerPoint, or, similarly, an email or calendar event is open in Outlook, the sensitivity of the data is displayed to users in the app with the label name and content markings (such as header or footer text) that have been configured for the label.

  • When the sensitivity label applies encryption, users must have the EXTRACT usage right, as well as VIEW, for Copilot to return the data (see the sketch after this list).

  • This protection extends to data stored outside your Microsoft 365 tenant when it's open in an Office app (data in use). For example, local storage, network shares, and cloud storage.
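As a minimal illustration of that gate, the following Python sketch models the rule with hypothetical data structures; the real check is enforced server-side by the Azure Rights Management service, not by client code like this.

```python
# Conceptual sketch only: Copilot returns encrypted, labeled content only when
# the requesting user holds both the VIEW and EXTRACT usage rights.
# The ProtectedItem type and its fields are hypothetical.
from dataclasses import dataclass, field

@dataclass
class ProtectedItem:
    name: str
    encrypted: bool                                       # True if a label applied encryption
    usage_rights: set[str] = field(default_factory=set)   # rights granted to the current user

def copilot_can_return(item: ProtectedItem) -> bool:
    """Return True if Copilot may include this item's content in a response."""
    if not item.encrypted:
        return True  # no label encryption: normal access controls still apply
    return {"VIEW", "EXTRACT"} <= item.usage_rights

print(copilot_can_return(ProtectedItem("plan.docx", True, {"VIEW"})))             # False
print(copilot_can_return(ProtectedItem("plan.docx", True, {"VIEW", "EXTRACT"})))  # True
```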

Tip

If you haven't already, we recommend you enable sensitivity labels for SharePoint and OneDrive and also familiarize yourself with the file types and label configurations that these services can process. When sensitivity labels aren't enabled for these services, the encrypted files that Copilot for Microsoft 365 can access are limited to data in use from Office apps on Windows.

For instructions, see Enable sensitivity labels for Office files in SharePoint and OneDrive.

Additionally, when you use Microsoft Copilot Graph-grounded chat (formerly Microsoft 365 Chat), which can access data from a broad range of content, the sensitivity of labeled data returned by Copilot for Microsoft 365 is made visible to users: the sensitivity label is displayed for citations and for the items listed in the response. Using the sensitivity labels' priority number that's defined in the Microsoft Purview portal or the Microsoft Purview compliance portal, the latest response in Copilot displays the highest-priority sensitivity label from the data used for that Copilot chat.

Compliance admins define a sensitivity label's priority, and a higher priority number usually denotes higher sensitivity of the content, with more restrictive permissions. As a result, Copilot responses are labeled with the most restrictive sensitivity label.
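To make that rule concrete, here's a minimal Python sketch that picks the label a Copilot response would display; the label names and priority numbers are hypothetical examples.

```python
# Conceptual sketch: a Copilot chat response displays the highest-priority
# (most restrictive) sensitivity label among the items used for that chat.
source_item_labels = [
    ("General", 1),
    ("Confidential\\Anyone (unrestricted)", 5),
    ("Public", 0),
]

# Higher priority number = more restrictive label, and that label wins.
response_label = max(source_item_labels, key=lambda label: label[1])
print(response_label[0])  # Confidential\Anyone (unrestricted)
```

The same highest-priority rule applies to label inheritance when Copilot creates new content from multiple labeled files, as described later in this article.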

Note

If items are encrypted by Microsoft Purview Information Protection but don't have a sensitivity label, Microsoft Copilot for Microsoft 365 also won't return these items to users if the encryption doesn't grant the user the VIEW and EXTRACT usage rights.

If you're not already using sensitivity labels, see Get started with sensitivity labels.

Although DLP policies don't yet support Microsoft Copilot for Microsoft 365 interactions, data classification with sensitive info types and trainable classifiers is supported to identify sensitive data in user prompts to Copilot, and in responses.
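As a rough illustration of what that classification involves, the sketch below runs a single regex over a prompt. This is a conceptual analogue only, not the Purview classification engine: real sensitive info types combine patterns, keywords, checksums, and confidence levels, and trainable classifiers use machine learning models.

```python
import re

# Hypothetical pattern standing in for a sensitive info type (SIT) match:
# 13-16 digits with optional spaces or hyphens, as in a credit card number.
CREDIT_CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def find_sensitive_spans(prompt: str) -> list[str]:
    """Return candidate sensitive matches detected in a Copilot prompt or response."""
    return CREDIT_CARD_PATTERN.findall(prompt)

print(find_sensitive_spans("Summarize the refund for card 4111 1111 1111 1111"))
```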

Copilot protection with sensitivity label inheritance

When you use Copilot to create new content based on an item that has a sensitivity label applied, the sensitivity label from the source file is automatically inherited, with the label's protection settings.

For example, a user selects Draft with Copilot in Word and then Reference a file. Or a user selects Create presentation from file in PowerPoint. The source content has the sensitivity label Confidential\Anyone (unrestricted) applied and that label is configured to apply a footer that displays "Confidential". The new content is automatically labeled Confidential\Anyone (unrestricted) with the same footer.

To see an example of this in action, watch the following demo from the Ignite 2023 session, Getting your enterprise ready for Microsoft 365 Copilot. The demo shows how the default sensitivity label of General is replaced with a Confidential label when a user drafts with Copilot and references a labeled file. The information bar under the ribbon informs the user that content created by Copilot resulted in the new label being automatically applied:

If multiple files are used to create new content, the sensitivity label with the highest priority is used for label inheritance.

As with all automatic labeling scenarios, the user can always override and replace an inherited label (or remove it, if you're not using mandatory labeling).

Microsoft Purview protection without sensitivity labels

Even if a sensitivity label isn't applied to content, services and products might use the encryption capabilities from the Azure Rights Management service. As a result, Copilot for Microsoft 365 can still check for the VIEW and EXTRACT usage rights before returning data and links to a user, but there's no automatic inheritance of protection for new items.

Tip

You'll get the best user experience when you always use sensitivity labels to protect your data and when encryption is applied by a label.

Examples of products and services that can use the encryption capabilities from the Azure Rights Management service without sensitivity labels:

  • Microsoft Purview Message Encryption
  • Microsoft Information Rights Management (IRM)
  • Microsoft Rights Management connector
  • Microsoft Rights Management SDK

For other encryption methods that don't use the Azure Rights Management service:

  • S/MIME protected emails won't be returned by Copilot, and Copilot isn't available in Outlook when an S/MIME protected email is open.

  • Password-protected documents can't be accessed by Copilot for Microsoft 365 unless they're already opened by the user in the same app (data in use). Passwords aren't inherited by a destination item.

As with other Microsoft 365 services, such as eDiscovery and search, items encrypted with Microsoft Purview Customer Key or your own root key (BYOK) are supported and eligible to be returned by Copilot for Microsoft 365.

Microsoft Purview supports compliance management for Copilot

Use Microsoft Purview compliance capabilities to support your risk and compliance requirements for Copilot for Microsoft 365.

Interactions with Copilot can be monitored for each user in your tenant. As such, you can use Purview's classification (sensitive info types and trainable classifiers), content search, communication compliance, auditing, eDiscovery, and automatic retention and deletion capabilities by using retention policies.

For communication compliance, you can analyze user prompts and Copilot responses to detect inappropriate or risky interactions or sharing of confidential information. For more information, see Configure a communication compliance policy to detect for Copilot for Microsoft 365 interactions.


For auditing, details are captured when users interact with Copilot. Events include how and when users interact with Copilot, in which Microsoft 365 service the activity took place, and references to the files stored in Microsoft 365 that were accessed during the interaction. If these files have a sensitivity label applied, that's also captured. In the Audit solution from the Microsoft Purview portal or the Microsoft Purview compliance portal, select Copilot activities and Interacted with Copilot. You can also select Copilot as a workload. For example, from the compliance portal:

Auditing options to identify user interactions with Microsoft Copilot for Microsoft 365.
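If you'd rather query these audit events programmatically, the following Python sketch uses the Microsoft Graph audit log query API. Treat it as a starting point under stated assumptions, not a verified recipe: it assumes an access token with the AuditLogsQuery.Read.All permission, the beta endpoint (where this API lived at the time of writing), and that the copilotInteraction record type filter is available in your tenant.

```python
import time
import requests

GRAPH = "https://graph.microsoft.com/beta"  # audit log query API (beta at time of writing)
TOKEN = "<access-token>"                    # assumes the AuditLogsQuery.Read.All permission
headers = {"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"}

# Create an audit log query scoped to Copilot interaction records.
query = {
    "displayName": "Copilot interactions - one week",
    "filterStartDateTime": "2024-06-01T00:00:00Z",
    "filterEndDateTime": "2024-06-08T00:00:00Z",
    "recordTypeFilters": ["copilotInteraction"],  # assumption: check the current audit schema
}
resp = requests.post(f"{GRAPH}/security/auditLog/queries", headers=headers, json=query)
resp.raise_for_status()
query_id = resp.json()["id"]

# The query runs asynchronously; poll until it completes (simplified, no timeout).
while True:
    status = requests.get(
        f"{GRAPH}/security/auditLog/queries/{query_id}", headers=headers
    ).json().get("status")
    if status == "succeeded":
        break
    time.sleep(30)

# Read the returned records (first page shown here).
records = requests.get(
    f"{GRAPH}/security/auditLog/queries/{query_id}/records", headers=headers
).json()
for record in records.get("value", []):
    print(record.get("userPrincipalName"), record.get("operation"))
```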

For content search, because user prompts to Copilot and responses from Copilot are stored in a user's mailbox, they can be searched and retrieved when the user's mailbox is selected as the source for a search query. Select and retrieve this data from the source mailbox by selecting Add condition > Type > Copilot interactions.

Similarly for eDiscovery, you use the same query process to select mailboxes and retrieve user prompts to Copilot and responses from Copilot. After the collection is created and added to the review phase in eDiscovery (Premium), this data is available for all the existing review actions. These collections and review sets can then be put on hold or exported. If you need to delete this data, see Search for and delete data for Microsoft Copilot for Microsoft 365.
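This retrieval can also be scripted with the Microsoft Graph eDiscovery API. The Python sketch below creates a search inside an existing eDiscovery (Premium) case; the case ID, token, and mailbox are placeholders, and the ItemClass query string is an assumption about how Copilot interactions are stored in the mailbox, so verify it against current documentation before relying on it.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access-token>"                   # assumes the eDiscovery.ReadWrite.All permission
CASE_ID = "<existing-ediscovery-case-id>"  # placeholder for a real case ID
headers = {"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"}

# Create a search for Copilot prompts and responses.
# The ItemClass value is an assumption; confirm the current query syntax.
search = {
    "displayName": "Copilot interactions - Alex",
    "contentQuery": "ItemClass:IPM.SkypeTeams.Message.Copilot*",
}
resp = requests.post(
    f"{GRAPH}/security/cases/ediscoveryCases/{CASE_ID}/searches",
    headers=headers,
    json=search,
)
resp.raise_for_status()
search_id = resp.json()["id"]

# Scope the search to a single user's mailbox as an additional source.
source = {
    "@odata.type": "microsoft.graph.security.userSource",
    "email": "alex@contoso.com",
}
requests.post(
    f"{GRAPH}/security/cases/ediscoveryCases/{CASE_ID}/searches/{search_id}/additionalSources",
    headers=headers,
    json=source,
).raise_for_status()
```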

For retention policies that support automatic retention and deletion, user prompts to Copilot and responses from Copilot are identified by the location Teams chats and Copilot interactions. This location was previously named just Teams chats, but users don't need to be using Teams chat for the policy to apply to them. Any existing retention policies previously configured for Teams chats now automatically include user prompts and responses to and from Microsoft Copilot for Microsoft 365:

Updated Teams chats retention location to include interactions for Microsoft Copilot for Microsoft 365.

For detailed information about how this retention works, see Learn about retention for Microsoft Copilot for Microsoft 365.

As with all retention policies and holds, if more than one policy for the same location applies to a user, the principles of retention resolve any conflicts. For example, the data is retained for the longest duration of all the applied retention policies or eDiscovery holds.
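A minimal sketch of that resolution rule, assuming each applied policy or hold is reduced to an optional retention end date (None modeling an indefinite hold): the content stays until the latest end date across everything that applies.

```python
from datetime import date

# Conceptual sketch of the "longest duration wins" principle described above.
# Policy names and dates are hypothetical examples.
applied_policies = [
    {"name": "Teams chats and Copilot interactions - 1 year", "retain_until": date(2025, 6, 1)},
    {"name": "eDiscovery hold - case 42", "retain_until": None},  # indefinite hold
]

def effective_retention(policies):
    """Return the earliest date content can be deleted, or None if held indefinitely."""
    dates = [p["retain_until"] for p in policies]
    return None if None in dates else max(dates)

print(effective_retention(applied_policies))  # None -> retained indefinitely
```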

For retention labels to automatically retain files referenced in Copilot, select the option for cloud attachments with an auto-apply retention label policy: Apply label to cloud attachments and links shared in Exchange, Teams, Viva Engage, and Copilot. As with all retained cloud attachments, the file version at the time it's referenced is retained.

Updated cloud attachments option for auto-apply retention label to include interactions for Copilot.

For detailed information about how this retention works, see How retention works with cloud attachments.

For configuration instructions, see the documentation for creating and configuring retention policies and for automatically applying retention labels.

Other documentation for the AI hub and Copilot

Blog post announcement: Secure your data to confidently take advantage of Generative AI with Microsoft Purview

For more detailed information, see Considerations for Microsoft Purview AI Hub and data security and compliance protections for Microsoft Copilot.
