An Azure service that provides access to OpenAI models with enterprise capabilities.
GPT-5.1-mini can be used with the Chat Completions API in Azure OpenAI in Foundry, but it must be called through the correct endpoint shape and API version for chat completions.
Key points based on the available information:
- Supported Chat Completions endpoint shape
For chat completions in the Foundry Models REST API preview, the documented endpoint is:
POST https://{endpoint}/openai/deployments/{deployment-id}/chat/completions?api-version=2025-04-01-preview
Required elements:
- `endpoint`: `https://{your-resource-name}.openai.azure.com`
- `deployment-id`: the name of the GPT-5.1-mini deployment
- `api-version`: `2025-04-01-preview` (for the preview chat completions API)

Headers (one of the following auth methods):
- `Authorization: Bearer {Azure_OpenAI_Auth_Token}`, with a token obtained for `https://cognitiveservices.azure.com`, or
- `api-key: {your-azure-openai-key}`
If the current calls are using a different api-version, a different base URL, or a path that does not match the above, a 404 Not Found is expected. Ensure:
- The base URL is exactly `https://{your-resource-name}.openai.azure.com` (no `/openai/v1/` suffix for this preview REST shape).
- The path is `/openai/deployments/{deployment-id}/chat/completions`.
- The query string includes `?api-version=2025-04-01-preview`.
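The endpoint shape above can be sketched as a minimal Python call using only the standard library. The resource and deployment names below are placeholders, not values from your environment:

```python
import json
import urllib.request

# Hypothetical values -- replace with your own resource name, deployment name, and key.
RESOURCE = "my-resource"
DEPLOYMENT = "gpt-5-1-mini-deployment"
API_VERSION = "2025-04-01-preview"

def chat_completions_url(resource: str, deployment: str, api_version: str) -> str:
    """Build the preview Chat Completions URL in the shape documented above."""
    return (
        f"https://{resource}.openai.azure.com"
        f"/openai/deployments/{deployment}/chat/completions"
        f"?api-version={api_version}"
    )

def call_chat_completions(api_key: str, messages: list) -> dict:
    """POST a minimal chat completions request using api-key auth."""
    req = urllib.request.Request(
        chat_completions_url(RESOURCE, DEPLOYMENT, API_VERSION),
        data=json.dumps({"messages": messages}).encode("utf-8"),
        headers={"Content-Type": "application/json", "api-key": api_key},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

If this exact URL shape returns 404 while the deployment works elsewhere, the checklist further down narrows the cause.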
- Relationship to Responses API
The Responses API and Chat Completions API are two different surfaces over the same models:
- Responses API: typically called via `/openai/responses` (often through the Foundry SDKs, e.g. `AIProjectClient` and `get_openai_client().responses.create(...)`).
- Chat Completions API: called via `/openai/deployments/{deployment-id}/chat/completions` with the preview API version.

The same deployment (for example, a GPT-5.1-mini deployment) can be addressed via:
- `responses.create(model=<deployment-name>, ...)` using the Foundry Responses client, or
- `POST /openai/deployments/{deployment-name}/chat/completions?api-version=2025-04-01-preview` using the Chat Completions REST API.
No special “deployment type” toggle is documented for enabling chat completions separately from responses; the critical requirement is using the correct endpoint and API version.
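The key difference between the two surfaces is where the deployment name travels: in the request body for Responses, in the URL path for Chat Completions. A small illustrative sketch (names are hypothetical, not an SDK API):

```python
def surface_targets(deployment: str, api_version: str = "2025-04-01-preview") -> dict:
    """Map one deployment name to the two documented request surfaces."""
    return {
        # Responses API: the deployment name is passed in the body as `model`.
        "responses": {"path": "/openai/responses", "model": deployment},
        # Chat Completions API: the deployment name is part of the path.
        "chat_completions": {
            "path": f"/openai/deployments/{deployment}/chat/completions",
            "api_version": api_version,
        },
    }
```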
- Checklist to resolve the 404
Validate the following against the current implementation:
- Endpoint
  - Must be: `https://{your-resource-name}.openai.azure.com`
  - Do not append `/openai/v1/` when using the Foundry Models REST preview shape shown above.
- Path
  - Must be: `/openai/deployments/{deployment-name}/chat/completions`
  - `{deployment-name}` must exactly match the deployment name configured in Azure OpenAI / Foundry.
- API version
  - Use `api-version=2025-04-01-preview` for chat completions in the Foundry Models REST API preview.
- Authentication
  - Use either:
    - `Authorization: Bearer {token}`, where the token is obtained via `az account get-access-token --resource https://cognitiveservices.azure.com` with the scope `https://cognitiveservices.azure.com/.default`, or
    - `api-key: {your-azure-openai-api-key}`.
If any of these elements are incorrect (for example, using an older api-version, a different base URL, or a mismatched deployment name), the service can return 404 Not Found even though the deployment works via the Responses API.
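The checklist can be mechanized. Below is a rough sketch (function and constant names are illustrative) that inspects a request URL and flags the likely 404 causes listed above:

```python
import re
from urllib.parse import urlsplit, parse_qs

# Expected values mirror the checklist above.
EXPECTED_API_VERSION = "2025-04-01-preview"
PATH_RE = re.compile(r"^/openai/deployments/(?P<name>[^/]+)/chat/completions$")

def diagnose_404(url: str, expected_deployment: str) -> list:
    """Return a list of likely 404 causes for a chat completions request URL."""
    problems = []
    parts = urlsplit(url)

    # Base URL check: {resource}.openai.azure.com, no /openai/v1/ prefix.
    if not parts.netloc.endswith(".openai.azure.com"):
        problems.append("base URL is not https://{resource}.openai.azure.com")

    # Path check: /openai/deployments/{name}/chat/completions, exact name match.
    m = PATH_RE.match(parts.path)
    if m is None:
        problems.append("path is not /openai/deployments/{name}/chat/completions")
    elif m.group("name") != expected_deployment:
        problems.append("deployment name in path does not match the deployment")

    # Query check: the preview api-version must be present.
    if parse_qs(parts.query).get("api-version") != [EXPECTED_API_VERSION]:
        problems.append(f"api-version is not {EXPECTED_API_VERSION}")
    return problems
```

An empty result means the URL matches the documented preview shape; any remaining 404 then points at the deployment or service side rather than the request.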
- When to assume chat completions are not available
From the provided documentation, chat completions are exposed via the preview endpoint above. There is no indication in the context that GPT-5.1-mini is excluded from chat completions. If, after aligning the endpoint, path, and api-version as described, the deployment still returns 404 specifically for chat completions while Responses continues to work, this would indicate a service-side or configuration issue that requires support investigation rather than a model limitation.
In that case, capture:
- Full request URL (with `api-version`)
- Deployment name
- Resource name/region
- Exact HTTP status and body

and provide them to Azure support for further analysis.
References:
- Azure OpenAI in Microsoft Foundry Models REST API preview reference
- Work with chat completions models
- Work with chat completions models (programming-language-dotnet)
- Use model router for Microsoft Foundry
- Azure OpenAI in Microsoft Foundry Models REST API reference