편집

다음을 통해 공유


Data, privacy, and security considerations for extending Microsoft 365 Copilot

When you extend Microsoft 365 Copilot with agents, queries based on your prompts, conversation history, and Microsoft 365 data can be shared with the agent to generate a response or complete a command. When you extend Microsoft 365 Copilot with Microsoft 365 Copilot connectors (formerly Microsoft Graph connectors), your external data is ingested into Microsoft Graph and remains in your tenant. This article outlines data privacy and security considerations for developing different Copilot extensibility solutions, both in-house and as a commercial developer.

Diagram key considerations for developing Copilot extensibility: Enterprise security and trust, Responsible AI, High-quality user experience, High-value functionality

Agents and actions

Agents in Microsoft 365 Copilot are individually governed by their terms of use and privacy policies. As an agent and actions (plugin) developer, you're responsible for securing your customer's data within the bounds of your service and providing information on your policies regarding users' personal information. Admins and users can then view your privacy policy and terms of use in the app store before choosing to add or use your agent.

When you integrate your business workflows as agents for Copilot, your external data stays within your app; it doesn't flow into Microsoft Graph and it isn't used to train Microsoft 365 Copilot LLMs. Copilot does, however, generate a search query to send to your agent on the user's behalf based on their prompt and conversation history with Copilot and data the user has access to in Microsoft 365.

Ensuring a secure implementation of declarative agents in Microsoft 365

Microsoft 365 customers and partners can build declarative agents that extend Microsoft 365 Copilot with custom instructions, grounding knowledge, and actions invoked via REST API descriptions configured by the declarative agent. At runtime, Microsoft 365 Copilot reasons over a combination of the user's prompt, custom instructions that are part of the declarative agent, and data provided by custom actions. All of this data might influence the behavior of the system, and such processing comes with security risks. Specifically, if a custom action can provide data from untrusted sources (such as emails or support tickets), an attacker might be able to craft a message payload that causes your agent to behave in a way that the attacker controls, such as incorrectly answering questions or even invoking custom actions. Microsoft takes many measures to prevent such attacks. In addition, organizations should only enable declarative agents that use trusted knowledge sources and connect to trusted REST APIs via custom actions. If the use of untrusted data sources is necessary, design the declarative agent with the possibility of breach in mind and don't give it the ability to perform sensitive operations without careful human intervention.

Microsoft 365 provides organizations with extensive controls that govern who can acquire and use integrated apps and which apps are enabled for groups or individuals within a Microsoft 365 tenant, including apps that use declarative agents. Tools like Copilot Studio, which enable users to create their own declarative agents, also include extensive controls that allow admins to govern connectors used for both knowledge and custom actions.

Copilot connectors

Microsoft 365 Copilot presents only data that each individual can access using the same underlying controls for data access used in other Microsoft 365 services. Microsoft Graph honors the user identity-based access boundary so that the Copilot grounding process only accesses content that the current user is authorized to access. This is also true of external data within Microsoft Graph ingested from a Copilot connector.

When you connect your external data to Copilot with a Copilot connector, your data flows into Microsoft Graph. You can manage permissions to view external items by associating an access control list (ACL) with a Microsoft Entra user and group ID or an external group.

Prompts, responses, and data accessed through Microsoft Graph aren't used to train foundation LLMs, including those used by Microsoft 365 Copilot.

Considerations for line-of-business developers

Microsoft 365 Copilot only shares data with and searches in agents or connectors that are enabled for Copilot by a Microsoft 365 admin. As a line-of-business developer of Copilot extensibility solutions, be sure that you and your admin are familiar with:

Considerations for independent software publishers

Submission of your app package to the Microsoft Partner Center Microsoft 365 and Copilot program requires meeting certification policies for acceptance to Microsoft 365 in-product stores. Microsoft certification policies and guidelines regarding privacy, security, and responsible AI include: