Responsible AI FAQ: Copilot in Microsoft Entra

This FAQ provides answers to common questions about Responsible AI as it relates to Security Copilot in Microsoft Entra. Learn how this AI-powered security solution enhances the efficiency and capabilities of IT and security professionals to improve security outcomes.

What is Copilot in Microsoft Entra?

Microsoft Security Copilot is a natural language, generative AI-powered security solution that helps increase the efficiency and capabilities of IT and security professionals to improve security outcomes at machine speed and scale. It draws context from plugins and data to answer prompts so that security professionals and IT admins can help keep their organizations secure.

What can Copilot in Microsoft Entra do?

Security Copilot embedded in Microsoft Entra answers questions in natural language so that you can get actionable responses to common identity and access management tasks.

It helps in the following scenarios:

  • Sign-in troubleshooting: Inspect sign-in logs and uncover the cause of failed sign-ins, including the policies evaluated for multifactor authentication (MFA) and Conditional Access. (See the sketch after this list for the kind of sign-in log data involved.)
  • Identity Protection for users and workload identities: Identify and mitigate risks of compromise for users, service principals, and workload identities.
  • Identity administration: Find user account information, group ownership and membership details, and changes to users, apps, groups, and roles from Microsoft Entra audit logs.
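
For context, the sign-in troubleshooting scenario draws on the same sign-in log data that is exposed through the Microsoft Graph API. The following minimal sketch is not part of Security Copilot; the user, access token, and permission noted in the comments are placeholders and assumptions. It only illustrates the kind of failed sign-in details (failure reason, Conditional Access outcome) that Copilot summarizes for you in natural language.

    # A minimal sketch, not part of Security Copilot: it queries Microsoft Graph
    # sign-in logs directly to show the kind of data Copilot summarizes.
    # Assumes an access token with the AuditLog.Read.All permission; the token
    # and the user principal name below are placeholders.
    import requests

    GRAPH_URL = "https://graph.microsoft.com/v1.0/auditLogs/signIns"
    ACCESS_TOKEN = "<access-token>"  # placeholder: obtain through your usual auth flow

    # Request the most recent sign-in events for one (hypothetical) user.
    response = requests.get(
        GRAPH_URL,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        params={
            "$filter": "userPrincipalName eq 'adele@contoso.com'",
            "$top": 10,
        },
        timeout=30,
    )
    response.raise_for_status()

    # Print the failure reason and Conditional Access outcome for each failed sign-in.
    for event in response.json().get("value", []):
        status = event.get("status", {})
        if status.get("errorCode", 0) != 0:  # errorCode 0 means the sign-in succeeded
            print(
                event.get("createdDateTime"),
                event.get("appDisplayName"),
                status.get("failureReason"),
                event.get("conditionalAccessStatus"),
            )

With Security Copilot, you can ask for the same information in everyday language, for example, "Why did this user's sign-ins fail yesterday?", instead of writing the query yourself.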

What are Security Copilot in Microsoft Entra’s intended uses?

Security Copilot embedded in Microsoft Entra is intended for use by identity and access administrators. Use it to ask questions about Microsoft Entra data and documentation, and to find and summarize details about users, groups, apps, sign-ins, and changes to those objects.
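
For context, the following minimal sketch, which is not part of Security Copilot, shows the kind of directory lookups (a user's account details and a group's owners) that these intended uses cover; the token, user, and group ID are placeholders. With Copilot, an administrator asks for this information in everyday language instead of making the calls directly.

    # A minimal sketch, not part of Security Copilot: it reads the same directory
    # data (a user's details and a group's owners) that Copilot can find and
    # summarize. Assumes a Microsoft Graph access token with directory read
    # permissions; the token, user, and group ID below are placeholders.
    import requests

    GRAPH = "https://graph.microsoft.com/v1.0"
    HEADERS = {"Authorization": "Bearer <access-token>"}  # placeholder token

    # Look up one (hypothetical) user's account details.
    user = requests.get(
        f"{GRAPH}/users/adele@contoso.com",
        headers=HEADERS,
        params={"$select": "displayName,userPrincipalName,accountEnabled"},
        timeout=30,
    )
    user.raise_for_status()
    print(user.json())

    # List the owners of one (hypothetical) group.
    owners = requests.get(f"{GRAPH}/groups/<group-id>/owners", headers=HEADERS, timeout=30)
    owners.raise_for_status()
    for owner in owners.json().get("value", []):
        print(owner.get("displayName"))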

How was Security Copilot in Microsoft Entra evaluated? What metrics are used to measure performance?

Security Copilot underwent substantial testing prior to being released. Testing included red teaming, which is the practice of rigorously testing the product to identify failure modes and scenarios that might cause Security Copilot to do or say things outside of its intended uses or that don't support the Microsoft AI Principles.

Now that it's released, user feedback is critical in helping Microsoft improve the system. You have the option of providing feedback whenever you receive output from Security Copilot embedded in Microsoft Entra; see How do I provide feedback on Security Copilot embedded in Microsoft Entra? later in this FAQ for details.

What are the limitations of Security Copilot embedded in Microsoft Entra? How can users minimize the impact of Security Copilot in Microsoft Entra’s limitations when using the system?

Preview features aren’t meant for production use and might have limited functionality.

Like any AI-powered technology, Security Copilot doesn’t get everything right. However, you can help improve its responses by providing your observations using the feedback tool, which is built into the platform.

The system is designed to respond to prompts related to identity and access administration. Prompts outside the scope of Microsoft Entra might result in responses that lack accuracy and comprehensiveness.

The system might not be able to process very long prompts, such as prompts containing hundreds of thousands of characters.

Use of Security Copilot embedded in Microsoft Entra might be subject to usage limits or capacity throttling. Even short prompts can take time (up to several minutes) to complete and can consume a high number of security compute units (SCUs).

What operational factors and settings allow for effective and responsible use of Security Copilot in Microsoft Entra?

You can use everyday words to describe what you’d like Security Copilot to do. For example: "Find this user" or "Who owns this group?"

You can also choose from the prompts provided in Security Copilot in Microsoft Entra, and select from the suggested prompts to continue a conversation.

You can provide feedback about a response, including reporting anything unacceptable to Microsoft.

How do I provide feedback on Security Copilot embedded in Microsoft Entra?

You have the option of providing feedback whenever you receive output from Security Copilot embedded in Microsoft Entra. When a response is inaccurate, incomplete, or unclear, give it a thumbs down and indicate one or more categories to flag any objectionable output. You can also confirm when responses are useful and accurate by giving it a thumbs up. These buttons appear at the bottom of every Security Copilot response and your feedback goes directly to Microsoft to help us improve.