Thank you for reaching out on Microsoft Q&A.
If you can’t see your GPT‑5.2 deployment when creating an agent or using the playground in Azure AI Foundry, the issue usually relates to permissions, connection scope, API type, or region/quota. Below are the steps to troubleshoot.
Verify the deployment and model eligibility
First, confirm that the deployment exists in your Azure OpenAI resource and that you deployed the correct model (e.g., gpt‑5.2 or gpt‑5.2‑chat). Check that the deployment completed successfully. GPT‑5.2 availability can be gated, so ensure your tenant has access and registration if required.
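If you prefer to verify outside the portal, the ARM deployments endpoint lists what is actually deployed on the resource. A minimal sketch of building that URL (the `api-version` value is an assumption; issue the GET with a bearer token, e.g. from `az account get-access-token`):

```python
# Hedged sketch: build the ARM REST URL that lists deployments on an
# Azure OpenAI resource, so you can confirm the gpt-5.2 deployment exists.
# The api-version here is an assumption; newer versions may also work.
def deployments_list_url(subscription: str, resource_group: str, account: str) -> str:
    """URL for GET-ing the deployments of a Cognitive Services account via ARM."""
    return (
        "https://management.azure.com"
        f"/subscriptions/{subscription}"
        f"/resourceGroups/{resource_group}"
        "/providers/Microsoft.CognitiveServices"
        f"/accounts/{account}/deployments"
        "?api-version=2023-05-01"
    )
```

A GET against this URL returns each deployment's name, model, and provisioning state, so you can confirm both that the deployment exists and that it completed successfully.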
Check RBAC roles
Ensure you have the right roles on the Azure OpenAI resource. You need either Cognitive Services OpenAI User (for inference) or Contributor (for managing deployments). Assign these roles via Azure Portal → OpenAI resource → Access control (IAM).
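To make the requirement concrete, here is a small local sketch of the role check described above. The role-name sets are based on the built-in Azure role names; whether a given management role suffices for your scenario is an assumption, so treat this as illustrative:

```python
# Illustrative sketch: which assigned roles satisfy the requirement above.
# Role names mirror the built-in Azure roles; the exact set needed for your
# scenario (inference vs. managing deployments) is an assumption.
INFERENCE_ROLES = {"Cognitive Services OpenAI User", "Cognitive Services OpenAI Contributor"}
MANAGEMENT_ROLES = {"Contributor", "Owner"}

def can_use_deployment(assigned_roles: set[str]) -> bool:
    """True if the principal's roles allow using an Azure OpenAI deployment."""
    return bool(assigned_roles & (INFERENCE_ROLES | MANAGEMENT_ROLES))
```

For example, a principal with only the Reader role would fail this check, which matches the symptom of deployments not appearing in Foundry.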
Add or fix the Foundry connection
In your Foundry project, go to Manage → Connections → + New connection → Azure OpenAI and select the resource where GPT‑5.2 is deployed. Test the connection, then return to Playgrounds/Agents and choose From connections to pick the deployment.
Match the API type
Foundry Agents expect deployments that support the Responses API. If your deployment uses the older chat-completions API, it may not appear. Use the Responses endpoint (…/openai/v1/) and specify the deployment name in your agent configuration.
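The Responses endpoint shape can be sketched as below. The commented client usage assumes the OpenAI Python SDK's v1 base-URL pattern for Azure OpenAI; exact client options may differ by SDK version:

```python
def responses_base_url(resource_name: str) -> str:
    """Base URL for the v1 (Responses) surface of an Azure OpenAI resource."""
    return f"https://{resource_name}.openai.azure.com/openai/v1/"

# With that base URL you can point the OpenAI Python SDK at your deployment
# (hedged: client options and auth flow may differ by SDK version):
#
#   from openai import OpenAI
#   client = OpenAI(base_url=responses_base_url("my-resource"), api_key="<key>")
#   resp = client.responses.create(model="gpt-5.2-chat", input="ping")
```

Note that `model` here takes the deployment name, not the underlying model name, which is why a mismatch makes the deployment appear to be missing.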
Region and quota checks
GPT‑5.x models are region-specific. If your resource is in a region without GPT‑5.2 quota, the deployment won’t show. Consider deploying in a supported region or requesting quota. Also, allow a few minutes after creating the deployment for it to propagate.
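The region/quota reasoning can be sketched as a simple filter. The region names and quota values below are placeholders, not actual GPT‑5.2 availability data; check the model catalog for real regions:

```python
# Illustrative only: region names and quota values are hypothetical, not
# actual GPT-5.2 availability data. Check the model catalog for real regions.
def deployable_regions(model_quota: dict[str, int]) -> list[str]:
    """Regions (from an assumed region -> remaining-quota map) with quota left."""
    return sorted(region for region, quota in model_quota.items() if quota > 0)

quota = {"eastus2": 0, "swedencentral": 50}  # hypothetical values
print(deployable_regions(quota))  # -> ['swedencentral']
```

If your resource sits in a zero-quota region, the fix is exactly as above: deploy in a supported region or request quota for the current one.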
Common blockers and quick fixes
- Make sure you’re signed into the correct tenant.
- Check that the Foundry project subscription matches the OpenAI resource subscription.
- If using Private Link or firewall restrictions, allow Foundry’s outbound traffic or configure DNS rules.
Agent-specific notes
Agents in Foundry integrate with OpenAI deployments via the Responses pipeline. Select the correct deployment name (e.g., gpt-5.2-chat) in your agent settings. If you added extra tools like MCP and the deployment disappears, remove them temporarily to isolate the issue.
Quick validation steps
- Deploy GPT‑5.2 from the Foundry model catalog to your OpenAI resource.
- Confirm RBAC roles.
- Create and test the Azure OpenAI connection in Foundry.
- Refresh the project and select the deployment in Playgrounds or Agents.
References
- Foundry model catalog deployment guide
- https://learn.microsoft.com/en-us/azure/ai-foundry/openai/how-to/role-based-access-control?view=foundry-classic
- https://learn.microsoft.com/en-us/azure/ai-foundry/how-to/connections-add?view=foundry-classic
Please let me know if you have any remaining questions or need additional details; I’ll be glad to provide further clarification or guidance.
Thank you!