Privacy and protections

Commercial data protection

When organizations and employees use generative AI services, it's important to understand how these services handle user and chat data. Because employee chats may contain sensitive data, Copilot is designed to protect this information, as illustrated here:

Diagram of Copilot architecture.

This is how commercial data protection works in Copilot:

  • Copilot uses Microsoft Entra ID (formerly known as Azure Active Directory) for authentication and only allows users to access Copilot with commercial data protection using their work account.
  • An Entra ID user's tenant and user information is removed from chat data at the start of a chat session. This information is only used to determine if the user is eligible for commercial data protection. Search queries triggered by prompts from an Entra ID user aren't linked to users or organizations by Bing.
  • Microsoft doesn't retain prompts or responses from Entra ID users beyond a short caching period needed for runtime purposes. After the browser is closed, the chat topic is reset, or the session times out, Microsoft discards all prompts and responses.
  • Chat data sent to and from Copilot with commercial data protection is encrypted in transit (TLS 1.2+) and at rest (AES-128) during the chat session. Microsoft has no 'eyes-on' access to it.
  • Because Microsoft doesn't retain prompts and responses, they can't be used as part of a training set for the underlying large language model.
  • Advertising shown to Entra ID users isn't targeted based on workplace identity or chat history.

These data protections extend to eligible Entra ID user chats in Copilot on copilot.microsoft.com and in Bing, Edge, and Windows. They also extend to Copilot chats in the Copilot, Bing, Edge, Microsoft Start, or Microsoft 365 mobile apps.
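The in-transit protection above relies on the client and service negotiating TLS 1.2 or later. As an illustration only (not Microsoft's implementation), the following sketch shows how a client can refuse connections below TLS 1.2 using Python's standard library:

```python
import ssl

# Sketch: enforce TLS 1.2+ for outbound connections, mirroring the
# in-transit encryption floor described above. Illustrative only.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse TLS 1.0 and 1.1

print(ctx.minimum_version >= ssl.TLSVersion.TLSv1_2)  # True
```

Any handshake that can't reach at least TLS 1.2 with this context fails rather than falling back to a weaker protocol version.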

Chat history and reporting

When commercial data protection is enabled, Copilot doesn't support the chat history feature, because it doesn't retain chat prompts or responses.

Copilot also offers no usage reporting or auditing capabilities to organizations. However, Copilot users may still be subject to other monitoring methods available to IT admins in their organization, such as internal logging, device or network logs, and other controls on company networks or devices.

Copilot is managed in accordance with our responsible AI principles, which means we take steps to mitigate misuse or harmful behavior and content.

Organizational data

Copilot is a generative AI service grounded only in public web data from the Bing search index. It doesn't have access to organizational resources or content within Microsoft 365, such as documents in OneDrive, emails, or other data in the Microsoft Graph.

Copilot for Microsoft 365 is required if your organization wants a chat experience grounded in work data inside your tenant boundary.

Copilot can access organizational content in the chat only when it's provided by users. This can be done in one of two ways:

  1. Users explicitly type or paste this information directly into the chat.
  2. Users type a prompt into Copilot in Edge after enabling the 'Allow access to any webpage or PDF' setting, and an intranet page is open in the browser. In this scenario, Copilot may use this content to help answer questions.

In both cases, when commercial data protection is enabled, Copilot doesn't retain any of this data after the chat session is over.

Microsoft as the data controller

Copilot is a connected service where Microsoft is the data controller. Users' prompts leave your organization's Microsoft 365 tenant boundary to reach the Copilot service. When commercial data protection is enabled, Microsoft doesn't retain this data beyond a short caching period for runtime purposes. After the browser is closed, the chat topic is reset, or the session times out, Microsoft discards all prompts and responses.
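The lifecycle described above can be pictured as a runtime-only cache: prompts and responses exist only for the active session and are dropped when it ends. The sketch below is a minimal illustration of that pattern (not Microsoft's implementation; the class name and timeout value are hypothetical):

```python
import time

class SessionCache:
    """Illustrative sketch of a runtime-only cache: chat turns live only
    for the duration of a session and are discarded on reset, browser
    close, or timeout. Not Microsoft's actual implementation."""

    def __init__(self, timeout_seconds=300):  # hypothetical timeout
        self.timeout = timeout_seconds
        self.turns = []                        # (prompt, response) pairs
        self.last_activity = time.monotonic()

    def add_turn(self, prompt, response):
        if time.monotonic() - self.last_activity > self.timeout:
            self.end_session()                 # timed out: discard first
        self.turns.append((prompt, response))
        self.last_activity = time.monotonic()

    def end_session(self):
        # Browser closed, topic reset, or timeout: drop everything.
        self.turns.clear()

session = SessionCache()
session.add_turn("What is TLS?", "TLS is a transport-security protocol.")
session.end_session()
print(len(session.turns))  # 0: nothing retained after the session ends
```

The key property is that nothing survives `end_session()`; there is no persistent store to query, report on, or train from afterward.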

To provide chat responses, Copilot uses global data centers for processing and may process data in the United States. Optional, Bing-backed connected experiences don't fall under Microsoft's EU Data Boundary (EUDB) commitment. Learn more: Continuing Data Transfers that apply to all EU Data Boundary services. They also don't fall under the terms of the Data Protection Addendum (DPA), which requires company data to remain inside geographic or tenant boundaries.

As a reminder, Copilot has no access to organizational data inside your tenant boundary, and chat conversations aren't saved or used to train the underlying models.

Organizations with strict requirements that data remain inside tenant or geographic boundaries should instead consider Copilot for Microsoft 365 or Azure OpenAI for generative AI services. Copilot with commercial data protection is intended to be a more secure alternative for organizations than consumer-oriented generative AI services.

For more information, see Microsoft 365 Data Residency and the Microsoft Privacy Statement.

Authentication and authorization

Commercial data protection is only available when signing in with the same Entra ID account used to access Microsoft 365 services such as SharePoint or Outlook.
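Eligibility checks like the one described earlier hinge on identity claims carried in the Entra ID token, such as the tenant ID (`tid`) claim. As an illustration only, the sketch below reads that claim from a toy token built with a hypothetical tenant ID; it performs no signature validation and is not how the Copilot service is actually implemented:

```python
import base64, json

def tenant_id(jwt_token: str) -> str:
    """Sketch: read the `tid` (tenant ID) claim from a JWT payload.
    Illustrative only; a real service must validate the signature."""
    payload_b64 = jwt_token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)   # restore base64 padding
    claims = json.loads(base64.urlsafe_b64decode(payload_b64))
    return claims["tid"]

# Build a toy token carrying a hypothetical tenant ID for demonstration.
header = base64.urlsafe_b64encode(b'{"alg":"none"}').decode().rstrip("=")
payload = base64.urlsafe_b64encode(
    json.dumps({"tid": "00000000-0000-0000-0000-000000000000"}).encode()
).decode().rstrip("=")
token = f"{header}.{payload}."

print(tenant_id(token))  # 00000000-0000-0000-0000-000000000000
```

Per the protections described earlier, this kind of identifier is used only to determine eligibility for commercial data protection and is removed from chat data at the start of a session.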

GDPR

The May 21, 2018, blog post from Microsoft outlines our commitment to GDPR compliance and how Microsoft helps businesses and other organizations meet their own GDPR obligations. You can find more details in the Microsoft Trust Center FAQ.

Copilot aligns with GDPR principles. Customers who wish to submit a right to be forgotten request to remove information from the Bing search index can do so here: Bing - Request Form to Block Search Results in Europe.

Advertising

Copilot occasionally shows advertisements as part of chat responses. An ad that appears in a chat response is triggered by the search queries generated from the user's prompt, not by their workplace identity.

Advertising to Entra ID users isn't targeted, meaning no information from the user's workplace identity is used to determine the ad that appears. Entra ID users won't be retargeted by ads they previously interacted with in Copilot.