Summary

In this module, you learned how Microsoft 365 Copilot aligns with Microsoft's commitments to enterprise data security and privacy. Microsoft 365 Copilot adheres to existing privacy and compliance obligations while unlocking business value: it presents Large Language Models (LLMs) with prompts that contain retrieved customer business data. Microsoft 365 Copilot accesses content and context through Microsoft Graph, consistent with the permissions of the individual using Microsoft 365. It combines the retrieved information with the user's working context, the Microsoft 365 Copilot session context, and the user's prompt to deliver accurate, relevant, and contextual responses. In this way, Microsoft 365 Copilot uses an organization's proprietary business data from Microsoft 365 services to generate responses personalized to the user's business context.

This module examined how Microsoft 365 Copilot is designed with security, compliance, and privacy in mind. It follows foundational principles such as:

  • Being built on Microsoft's comprehensive approach to security, compliance, and privacy.
  • Being architected to protect tenant, group, and individual data.
  • Being committed to responsible AI.

Microsoft 365 Copilot accesses only the data that individual users have at least View permission for within Microsoft 365 services such as SharePoint, OneDrive, and Teams. Each Microsoft 365 tenant has its own isolated Microsoft 365 Copilot orchestrator instance, which keeps all accessed data within the organization's compliance boundary.
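
The same permission trimming can be seen directly in Microsoft Graph. The following is a minimal sketch (not Copilot's internal implementation) that calls the Microsoft Graph Search API with a delegated access token for a signed-in user; because the token represents that user, Graph returns only SharePoint and OneDrive items the user is already permitted to view. The function name and the example query string are illustrative only.

```python
# Minimal sketch: permission-trimmed search against Microsoft Graph.
# Assumes you already have a delegated (user) access token with the
# appropriate Search/Files permissions granted.
import requests

GRAPH_SEARCH_URL = "https://graph.microsoft.com/v1.0/search/query"


def search_as_user(access_token: str, query_text: str) -> list[dict]:
    """Return SharePoint/OneDrive items matching query_text that the
    signed-in user has permission to view."""
    payload = {
        "requests": [
            {
                # Search files in OneDrive/SharePoint and SharePoint list items.
                "entityTypes": ["driveItem", "listItem"],
                "query": {"queryString": query_text},
            }
        ]
    }
    response = requests.post(
        GRAPH_SEARCH_URL,
        headers={"Authorization": f"Bearer {access_token}"},
        json=payload,
        timeout=30,
    )
    response.raise_for_status()

    # Flatten the hits from the search response; results are already
    # trimmed to the caller's permissions by Microsoft Graph.
    hits: list[dict] = []
    for result in response.json().get("value", []):
        for container in result.get("hitsContainers", []):
            hits.extend(container.get("hits", []))
    return hits


# Example usage (token acquisition not shown):
# hits = search_as_user(token, "Q3 sales report")
```

Because Copilot's grounding requests run in the context of the individual user, a document the user cannot open in SharePoint or OneDrive is likewise never returned to Copilot for that user.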