Secure developer AI environments with Microsoft Purview

Intermediate
Administrator
Developer
Auditor
Microsoft 365
Microsoft Purview
Microsoft Entra
Microsoft Copilot Studio
Microsoft Fabric
Foundry Tools

Microsoft Purview provides tools to secure developer AI environments by discovering apps, assessing their access to sensitive data, and applying appropriate protections. These capabilities include detecting generative AI usage, assigning user risk levels, and dynamically enforcing policies based on user behavior and data sensitivity.

Learning objectives

In this module, you learn how to:

  • Discover developer AI apps and assess their access to sensitive data
  • Enforce protections for Azure AI services and Entra-registered apps
  • Govern AI agents built in Copilot Studio
  • Retain and classify prompt and response content
  • Investigate risky AI usage with Insider Risk Management, Communication Compliance, and Audit
  • Apply dynamic protections using Adaptive Protection and risk-based policies
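The discovery step above starts with enumerating Entra-registered apps, which you can do through the Microsoft Graph `applications` endpoint. The sketch below builds such a request and applies a naive name-based filter for AI-related registrations; the keyword filter and helper names are illustrative assumptions, not Purview features, and token acquisition (an app needs `Application.Read.All`) is omitted.

```python
# Hedged sketch: discovering Entra-registered apps via Microsoft Graph.
# Assumes a valid bearer token is already available; acquiring one is out of scope.
from urllib.request import Request

GRAPH_APPS = "https://graph.microsoft.com/v1.0/applications"

def build_app_discovery_request(token: str) -> Request:
    """Build a Graph request listing app registrations and the APIs they request."""
    url = f"{GRAPH_APPS}?$select=displayName,appId,requiredResourceAccess"
    return Request(url, headers={"Authorization": f"Bearer {token}"})

def ai_related(apps: list[dict]) -> list[dict]:
    """Placeholder heuristic (not a Purview capability): flag registrations
    whose display name suggests generative AI usage."""
    keywords = ("copilot", "openai", "gpt")
    return [a for a in apps
            if any(k in a.get("displayName", "").lower() for k in keywords)]
```

In practice, Purview's Data Security Posture Management for AI surfaces this inventory for you; a script like this is only useful for ad hoc cross-checks against your tenant's app registrations.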

Prerequisites

To get the most from this module, you should have:

  • A basic understanding of Microsoft Purview features, such as sensitivity labels, data loss prevention, and retention
  • A general understanding of Microsoft Entra app registration and identity management
  • Familiarity with how Microsoft 365 Copilot and AI services access organizational data