Data, privacy, and security considerations for extending Microsoft 365 Copilot
When you extend the list of Copilot skills with a plugin, queries based on your prompts, conversation history, and Microsoft 365 data can be shared with the plugin to generate a response or complete a command. When you extend Copilot with a Microsoft Graph connector, your external data is ingested into Microsoft Graph and remains in your tenant. This article outlines data, privacy, and security considerations for developing Copilot extensibility solutions, whether you build them in-house or as a commercial developer.
Ensuring a secure implementation of declarative agents in Microsoft 365
Microsoft 365 customers and partners can build declarative agents that extend Microsoft 365 Copilot with custom instructions, grounding knowledge, and actions invoked through the REST API descriptions configured for the agent. At runtime, Microsoft 365 Copilot reasons over a combination of the user's prompt, the custom instructions that are part of the declarative agent, and data provided by custom actions. All of this data can influence the behavior of the system, and that processing carries security risks. In particular, if a custom action can return data from untrusted sources (such as emails or support tickets), an attacker might craft a message payload that causes your agent to behave in a way the attacker controls, from answering questions incorrectly to invoking custom actions. While Microsoft takes many measures to prevent such attacks, organizations should only enable declarative agents that use trusted knowledge sources and connect to trusted REST APIs through custom actions. If you must use untrusted data sources, design the declarative agent around the possibility of compromise, and don't give it the ability to perform sensitive operations without careful human intervention.
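To make this guidance concrete, the following minimal sketch shows how a declarative agent might be scoped to trusted sources. It's written as a TypeScript object literal that mirrors the general shape of a declarative agent manifest; the names, URLs, connection ID, and action file below are hypothetical, and field names can differ across manifest schema versions.

```typescript
// Illustrative only: a declarative agent scoped to trusted knowledge sources and a
// trusted custom action. Names, URLs, IDs, and file names are hypothetical.
export const declarativeAgent = {
  name: "IT Helpdesk Agent",
  description: "Answers questions about internal IT policies and open tickets.",
  // Instructions can reinforce boundaries, but they are not a security control on
  // their own; untrusted content returned by an action can still try to steer the agent.
  instructions:
    "Only answer using the configured knowledge sources. " +
    "Never take an action that modifies data without asking the user to confirm first.",
  capabilities: [
    // Ground only on a curated SharePoint site rather than all tenant content.
    {
      name: "OneDriveAndSharePoint",
      items_by_url: [{ url: "https://contoso.sharepoint.com/sites/ITPolicies" }],
    },
    // Ground on a specific, admin-approved Microsoft Graph connector connection.
    { name: "GraphConnectors", connections: [{ connection_id: "contosotickets" }] },
  ],
  // Custom actions call only a trusted REST API described in the referenced plugin file.
  actions: [{ id: "ticketLookup", file: "ticket-plugin.json" }],
};
```

The key design point is that instructions alone aren't a security boundary; scoping knowledge and actions to trusted sources, and requiring confirmation for sensitive operations, is what limits the impact of a crafted payload.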
Microsoft 365 gives organizations extensive controls over who can acquire and use integrated apps, and over which specific apps are enabled for groups or individuals within a Microsoft 365 tenant, including apps that contain declarative agents. Tools such as Copilot Studio, which let users create their own declarative agents, also include extensive controls that allow admins to govern the connectors used for both knowledge and custom actions.
Microsoft Graph connectors
Microsoft 365 Copilot presents only data that each individual user can access, using the same underlying data access controls used in other Microsoft 365 services. Microsoft Graph honors the user's identity-based access boundary, so the Copilot grounding process only accesses content that the current user is authorized to access. The same applies to external data ingested into Microsoft Graph through a Microsoft Graph connector.
When you connect your external data to Copilot with a Microsoft Graph connector, your data flows into Microsoft Graph. You manage permissions to view external items by associating an access control list (ACL) with Microsoft Entra user and group IDs or with external groups.
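As a minimal sketch of that model, the following TypeScript snippet creates an external item through the Microsoft Graph JavaScript client and attaches an ACL to it. The connection ID, item ID, Microsoft Entra object ID, and property names are hypothetical; a real connector would acquire an app-only token (for example, with MSAL) that has the `ExternalItem.ReadWrite.OwnedBy` permission and would use a schema it has already registered on the connection.

```typescript
import { Client } from "@microsoft/microsoft-graph-client";

// Stub auth provider: in a real connector, return an app-only access token here.
const authProvider = {
  getAccessToken: async (): Promise<string> => process.env.GRAPH_TOKEN ?? "",
};

const client = Client.initWithMiddleware({ authProvider });

// Connection ID, item ID, Entra group ID, and properties below are hypothetical.
await client
  .api("/external/connections/contosotickets/items/ticket-1001")
  .put({
    acl: [
      // Members of this Microsoft Entra group can see the item, so Copilot will
      // only ground on it for users in that group.
      {
        type: "group",
        value: "1f9f1b3e-0000-0000-0000-000000000000",
        accessType: "grant",
      },
    ],
    properties: {
      // Property names must match the schema registered on the connection.
      title: "Printer error 0x8000ffff",
    },
    content: {
      type: "text",
      value: "Steps the support team used to resolve the printer error.",
    },
  });
```

Because the ACL travels with the item, Copilot grounding honors it automatically: the item is only used to answer prompts from users the ACL grants access to.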
Prompts, responses, and data accessed through Microsoft Graph aren't used to train foundation LLMs, including those used by Microsoft 365 Copilot.
Plugins
Similar to traditional Teams apps and Power Platform connectors, plugins for Microsoft 365 Copilot are individually governed by their terms of use and privacy policies. As a plugin developer, you're responsible for securing your customers' data within the bounds of your service and for providing information on your policies regarding users' personal information. Admins and users can then view your privacy policy and terms of use in the app store before choosing to add or use your plugin as a Copilot data source.
When you plug your app into Copilot as a plugin, your external data stays within your app; it doesn't flow into Microsoft Graph, nor is it used to train Microsoft Copilot LLMs. Copilot does, however, generate a search query to send to your plugin on the user's behalf, based on the user's prompt, their conversation history with Copilot, and data the user has access to in Microsoft 365.
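For a message extension plugin, that generated query arrives through the same handler that serves user-typed searches. The following is a minimal sketch using the Bot Framework SDK for JavaScript; `TicketSearchPlugin` and `searchTickets` are hypothetical, and a real plugin would query its own backend and register the handler with a bot adapter as usual.

```typescript
import {
  CardFactory,
  TeamsActivityHandler,
  TurnContext,
  MessagingExtensionQuery,
  MessagingExtensionResponse,
} from "botbuilder";

// Hypothetical backend search; a real plugin would query its own data store here.
async function searchTickets(
  text: string
): Promise<{ title: string; summary: string }[]> {
  return [{ title: `Results for "${text}"`, summary: "..." }];
}

class TicketSearchPlugin extends TeamsActivityHandler {
  // Copilot sends the search query it generated on the user's behalf to this
  // handler, just as a user-typed query would arrive from the Teams UI.
  protected async handleTeamsMessagingExtensionQuery(
    _context: TurnContext,
    query: MessagingExtensionQuery
  ): Promise<MessagingExtensionResponse> {
    const searchText = (query.parameters?.[0]?.value as string) ?? "";
    const tickets = await searchTickets(searchText);

    // Only the results returned here are shared back to Copilot;
    // the rest of your data never leaves your service.
    return {
      composeExtension: {
        type: "result",
        attachmentLayout: "list",
        attachments: tickets.map((t) => CardFactory.heroCard(t.title, t.summary)),
      },
    };
  }
}
```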
Plugin authentication depends on the plugin type:
- API plugins support the OAuth 2.0 authorization code flow and API key authentication (see the sketch after this list).
- Message extension plugins use the same authentication process as Teams message extensions.
- Power Platform connector plugins use the same authentication process as custom connectors.
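For the API key scheme, your plugin's backend is responsible for rejecting requests that don't carry the registered key before touching any data. The following Express-based sketch illustrates one way to do that; the header name, environment variable, and `/tickets` endpoint are illustrative assumptions, since the OpenAPI security scheme you declare for the plugin determines where the key is actually sent.

```typescript
import express from "express";

const app = express();

// Reject any request that doesn't present the key registered for the plugin.
// Header name and environment variable are illustrative.
app.use((req, res, next) => {
  const presented = req.header("x-api-key");
  if (!presented || presented !== process.env.PLUGIN_API_KEY) {
    res.status(401).json({ error: "Invalid or missing API key" });
    return;
  }
  next();
});

// A hypothetical endpoint that Copilot's generated query might call.
// Only the data returned here is shared back to Copilot.
app.get("/tickets", (_req, res) => {
  res.json([{ id: "T-1001", title: "Printer error 0x8000ffff" }]);
});

app.listen(3000);
```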
Considerations for line-of-business developers
Microsoft 365 Copilot only shares data with and searches in plugins or connectors that are enabled for Copilot by a Microsoft 365 admin. As a line-of-business developer of Copilot extensibility solutions, ensure you and your admin are familiar with:
- Microsoft 365 Copilot requirements
- Data, Privacy, and Security for Microsoft 365 Copilot (admin documentation)
- Zero Trust principles for Microsoft 365 Copilot, a deployment plan for applying Zero Trust principles to Microsoft Copilot
- Relevant Microsoft 365 admin center procedures
Considerations for independent software publishers
Power Platform connectors that act as Copilot plugins are certified and packaged in the same way as regular Power Platform connectors. They can then be submitted to Microsoft Partner Center through the Microsoft 365 and Copilot program as a Power Platform connector.
Message extension plugins are packaged and distributed in the same way as Teams apps that are integrated to run across the Microsoft 365 ecosystem. Microsoft Graph connectors can also be packaged and distributed in the same way as Teams apps.
Submitting your app package to the Microsoft Partner Center Microsoft 365 and Copilot program requires meeting certification policies for acceptance to Microsoft 365 in-product stores, including the Microsoft commercial marketplace certification policies and the applicable Teams Store validation guidelines regarding privacy, security, and responsible AI.
For Microsoft Graph connectors (packaged as Teams apps), submission to the Microsoft 365 and Copilot program is currently limited to verified publishers. This gives end users and organizational admins assurance that the publisher of an app has been verified as authentic by Microsoft.