Privacy and protections

Commercial data protection

When organizations and employees use generative AI services, it's important to understand how these services handle user and chat data. Because employee chats may contain sensitive data, Bing Chat Enterprise has the following measures in place to protect this information:

  • Bing Chat Enterprise uses Microsoft Entra ID (formerly known as Azure Active Directory) for authentication and only allows users to access it with their work accounts.
  • Tenant and user identity information is removed from chat data at the start of a chat session. This information is only used to determine if the user is eligible to access Bing Chat Enterprise. Search queries triggered by prompts aren't linked to users or organizations by Bing.
  • Microsoft doesn't retain prompts or responses from users in Bing Chat Enterprise beyond a short caching period that's needed for runtime purposes. After the browser is closed, the chat topic is reset, or the session times out, Microsoft discards all prompts and responses.
  • Chat data sent to and from Bing Chat Enterprise is encrypted in transit and at rest (during the chat session). Microsoft has no 'eyes-on' access to it.
  • Because Microsoft doesn't retain prompts and responses, they can't be used as part of a training set for the underlying large language model.

These data protections extend to chat in the Edge sidebar and Windows Copilot when Bing Chat Enterprise is enabled.

Chat history and reporting

Unlike the consumer Bing Chat experience, Bing Chat Enterprise doesn't support the chat history feature and doesn't retain chat prompts or responses.

Bing Chat Enterprise doesn't offer usage reporting or auditing capabilities to organizations. However, users may still be subject to other monitoring methods available to IT admins in their organization, such as internal logging or device and network logs, on company networks or devices.

Bing Chat Enterprise is managed in accordance with our responsible AI principles, which means we take steps to mitigate misuse or harmful behavior and content.

Organizational data

Bing Chat Enterprise is a generative AI service grounded only in public web data from the Bing search index. It doesn't have access to organizational resources or content within Microsoft 365, such as documents in OneDrive, emails, or other data in the Microsoft Graph.

Microsoft 365 Copilot is required if your organization wants a chat experience grounded in work data inside your tenant boundary.
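
To make that distinction concrete, the Python sketch below is a generic Microsoft Graph call, unrelated to Bing Chat Enterprise itself; the access token and permission shown are hypothetical. It illustrates the kind of tenant-bound content (here, the signed-in user's OneDrive files) that Bing Chat Enterprise can't reach and that Microsoft 365 Copilot is designed to ground on.

  # Generic Microsoft Graph call with a hypothetical access token; this is an
  # illustration of tenant-bound organizational data, not part of Bing Chat Enterprise.
  import requests

  ACCESS_TOKEN = "<Entra ID access token with the Files.Read scope>"  # hypothetical

  response = requests.get(
      "https://graph.microsoft.com/v1.0/me/drive/root/children",
      headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
      timeout=30,
  )
  response.raise_for_status()

  # Each item is a file or folder in the signed-in user's OneDrive; this content
  # stays inside the tenant boundary and is never visible to Bing Chat Enterprise.
  for item in response.json().get("value", []):
      print(item["name"])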

The only organizational content accessible to Bing Chat Enterprise is content that users provide in the chat themselves. This can happen in one of two ways:

  1. Users explicitly type or copy this information directly into the chat.
  2. Users type a prompt into the chat in the Edge sidebar while the 'Allow access to any webpage or PDF' setting is enabled and a document or intranet page is open in the browser. In this scenario, Bing Chat Enterprise may use that open content to help answer the question.

In both cases, Bing Chat Enterprise doesn't retain any of this data after the chat session is over.

Microsoft as the data controller

Bing Chat Enterprise is a connected service where Microsoft is the data controller. Users' prompts leave your organization's Microsoft 365 tenant boundary to reach the Bing Chat Enterprise service. However, this data is encrypted in transit, and Microsoft doesn't retain this data beyond a short caching period for runtime purposes. After the browser is closed, the chat topic is reset, or the session times out, Microsoft discards all prompts and responses.

To provide chat responses, Bing Chat Enterprise uses global data centers for processing and may process data in the United States. Optional, Bing-backed connected experiences don't fall under Microsoft's EU Data Boundary (EUDB) commitment. Learn more: Continuing Data Transfers that apply to all EU Data Boundary services. They also don't fall under the terms of Enterprise Subscription Agreements (EAS) or Campus and School Agreements (CASA), which may require company data to remain inside geographic or tenant boundaries.

As a reminder, Bing Chat Enterprise has no access to organizational data inside your tenant boundary, and chat conversations aren't saved or used to train the underlying models.

Organizations with strict requirements that data remain inside tenant or geographic boundaries should instead consider Microsoft 365 Copilot or Azure OpenAI for generative AI services. Bing Chat Enterprise is intended to give organizations a more secure alternative to consumer-oriented generative AI services.
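
As a rough illustration of the Azure OpenAI alternative, the Python sketch below calls a chat deployment in an Azure OpenAI resource. The endpoint, deployment name, and environment variable are hypothetical placeholders; the point is only that requests go to an Azure resource the organization provisions in its chosen region rather than to the shared Bing-backed service.

  # Minimal sketch of calling an organization's own Azure OpenAI resource.
  # The endpoint, deployment name, and environment variable are placeholders.
  import os
  from openai import AzureOpenAI

  client = AzureOpenAI(
      azure_endpoint="https://contoso-openai.openai.azure.com",  # hypothetical resource
      api_key=os.environ["AZURE_OPENAI_API_KEY"],
      api_version="2024-02-01",
  )

  response = client.chat.completions.create(
      model="gpt-4o-chat",  # hypothetical deployment name inside the resource
      messages=[{"role": "user", "content": "Summarize our travel policy in three bullets."}],
  )
  print(response.choices[0].message.content)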

For more information, see Microsoft 365 Data Residency and the Microsoft Privacy Statement.

Authentication and authorization

Users can only access Bing Chat Enterprise with a work account, by signing in with the same Microsoft Entra ID account they use to access Microsoft 365 services such as SharePoint or Outlook.
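
For illustration only, the Python sketch below shows the general Entra ID sign-in pattern using the MSAL library with a hypothetical app registration. Bing Chat Enterprise performs this sign-in itself in the browser; the sketch simply shows how the 'organizations' authority restricts sign-in to work or school accounts.

  # General Entra ID sign-in pattern with MSAL; the client ID is a hypothetical
  # placeholder, and this code is not part of Bing Chat Enterprise itself.
  import msal

  CLIENT_ID = "00000000-0000-0000-0000-000000000000"  # hypothetical app registration

  app = msal.PublicClientApplication(
      CLIENT_ID,
      # The 'organizations' authority accepts work or school accounts only,
      # so personal Microsoft accounts can't sign in.
      authority="https://login.microsoftonline.com/organizations",
  )

  # Opens a browser window for interactive sign-in and returns tokens on success.
  result = app.acquire_token_interactive(scopes=["User.Read"])
  if "access_token" in result:
      print("Signed in as:", result.get("id_token_claims", {}).get("preferred_username"))
  else:
      print("Sign-in failed:", result.get("error_description"))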

GDPR

The May 21, 2018, blog post from Microsoft outlines our commitment to GDPR compliance and how Microsoft helps businesses and other organizations meet their own GDPR obligations. You can find more details in the Microsoft Trust Center FAQ.

Bing Chat Enterprise aligns with GDPR principles. Customers who want to submit a right-to-be-forgotten request to remove information from the Bing search index can do so here: Bing - Request Form to Block Search Results in Europe.

Advertising

Advertising shown on Bing Chat Enterprise isn't targeted based on your workplace identity or chat history.