Summary

In this module, you learned about Microsoft Copilot for Microsoft 365, which aligns with Microsoft's existing commitments to data security and privacy in the enterprise. Copilot for Microsoft 365 adheres to all existing privacy and compliance obligations and unlocks business value by presenting Large Language Models (LLMs) with prompts grounded in retrieved customer business data. Copilot for Microsoft 365 accesses content and context through Microsoft Graph, consistent with the permissions of the individual user of Microsoft 365. It combines the retrieved information with the user's working context, the Copilot for Microsoft 365 session context, and the user's prompt to deliver accurate, relevant, and contextual responses. Because Copilot for Microsoft 365 draws on an organization's proprietary business data from Microsoft 365 services, its responses are personalized to each user's business context.
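The module itself contains no code, but as a rough illustration of what permission-aware retrieval through Microsoft Graph looks like, the following sketch queries the Microsoft Graph search API with the signed-in user's delegated permissions, so results are limited to content that user can already access. The getAuthProvider helper and the query string are assumptions added here for illustration; this is not part of Copilot for Microsoft 365 itself.

```typescript
// Minimal sketch: searching Microsoft 365 content on behalf of the signed-in
// user with the Microsoft Graph search API. Results are security-trimmed to
// items the user can already open.
import { Client } from "@microsoft/microsoft-graph-client";
import { getAuthProvider } from "./auth"; // hypothetical helper supplying a delegated auth provider

async function searchUserContent(queryString: string) {
  // The client runs under the user's delegated permissions, not an app-only identity.
  const client = Client.initWithMiddleware({ authProvider: getAuthProvider() });

  const response = await client.api("/search/query").post({
    requests: [
      {
        entityTypes: ["driveItem", "listItem"], // SharePoint and OneDrive content
        query: { queryString },
        from: 0,
        size: 5,
      },
    ],
  });

  // Each hit is content the user could also open directly in SharePoint or
  // OneDrive; nothing outside their permissions is returned.
  return response.value;
}
```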

This module examined how Copilot for Microsoft 365 is designed with security, compliance, and privacy in mind. It follows foundational principles such as:

  • Being built on Microsoft's comprehensive approach to security, compliance, and privacy.
  • Being architected to protect tenant, group, and individual data.
  • Being committed to responsible AI.

Copilot for Microsoft 365 accesses only data that individual users have at least View permissions for within Microsoft 365 services such as SharePoint, OneDrive, and Teams. Each Microsoft 365 tenant has its own isolated Copilot for Microsoft 365 orchestrator instance, which keeps all accessed data within the organization's compliance boundary.
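As a small illustration of that permission boundary, the sketch below makes a delegated Microsoft Graph call for a specific drive item: the call succeeds only when the signed-in user has at least read (View) access, and otherwise Graph returns an error, so the content is simply unavailable. The getAuthProvider helper and the error-handling choices are assumptions for illustration, not Copilot's internal implementation.

```typescript
// Minimal sketch: a delegated Graph request fails for content the signed-in
// user cannot view, which is the same boundary Copilot for Microsoft 365 stays within.
import { Client, GraphError } from "@microsoft/microsoft-graph-client";
import { getAuthProvider } from "./auth"; // hypothetical helper supplying a delegated auth provider

async function tryReadItem(driveId: string, itemId: string) {
  const client = Client.initWithMiddleware({ authProvider: getAuthProvider() });
  try {
    // Succeeds only if the signed-in user has at least View (read) permission on the item.
    return await client.api(`/drives/${driveId}/items/${itemId}`).get();
  } catch (error) {
    if (error instanceof GraphError && (error.statusCode === 403 || error.statusCode === 404)) {
      // The user lacks permission (or the item is hidden from them),
      // so the content is not available to this call.
      return null;
    }
    throw error;
  }
}
```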