Bug Report: Azure OpenAI Assistants API Streaming Suddenly Returning 400 Errors

Noel Stieglitz 0 Reputation points
2025-02-28T20:22:22.8566667+00:00

On 2/26 around 8:22 am, the Azure OpenAI Assistants API stopped allowing streaming responses from an Azure-deployed OpenAI gpt-4o model. Here is the error:

HTTP 400 (invalid_request_error: unsupported_model) Parameter: model Unsupported value: 'stream' does not support 'true' with this model. Supported values are: 'false'.

Model Deployment details

  • Model: gpt-4o
  • Model version: 2024-05-13 (though others as well)
  • Region: eastus2
  • Deployment Type: Global Standard

The deployed model had previously been working for quite some time. I tried deploying a new gpt-4o model and hit the same issue. No code or packages have changed on our end that could explain the breakage. We did try upgrading the NuGet package to see if that would resolve the issue; it did not.
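(For reference, the package bump we tried was just the usual client library upgrade, something along these lines; the package name and version are inferred from the User-Agent in the capture below.)

# Upgrade the .NET Azure OpenAI client library (version taken from the
# User-Agent below); this did not change the behavior.
dotnet add package Azure.AI.OpenAI --version 2.2.0-beta.2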

I was able to work around this issue by deploying gpt-4o-mini, but obviously that has other implications. I tried some of the o-series models as well, which, as far as I can tell, should support streaming; those had the same issue as gpt-4o. Here's an example request I captured using a proxy:

POST /openai/threads/[redacted]/runs?api-version=2025-01-01-preview HTTP/1.1
Host: [redacted].openai.azure.com
OpenAI-Beta: assistants=v2
Accept: application/json
User-Agent: azsdk-net-AI.OpenAI/2.2.0-beta.2 (.NET 9.0.2; Microsoft Windows 10.0.26100)
x-ms-client-request-id: [redacted]
api-key: [redacted]
Request-Context: [redacted]
Request-Id: [redacted]
traceparent: [redacted]
Content-Type: application/json
Content-Length: 87

Response:


{
  "error": {
    "message": "Unsupported value: 'stream' does not support 'true' with this model. Supported values are: 'false'.",
    "type": "invalid_request_error",
    "param": "model",
    "code": "unsupported_model"
  }
}

I can share a curl request that consistently reproduces the issue as well as the redacted bits above with Azure support personnel. I tried creating a support ticket via Azure but was unsuccessful.
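A curl of roughly this shape reproduces it; the resource name, thread ID, and assistant ID below are placeholders for the redacted values:

# Create a streaming run on an existing thread via the Assistants v2 API.
# <resource>, <thread_id>, and <assistant_id> are placeholders.
curl -sS "https://<resource>.openai.azure.com/openai/threads/<thread_id>/runs?api-version=2025-01-01-preview" \
  -H "api-key: $AZURE_OPENAI_API_KEY" \
  -H "OpenAI-Beta: assistants=v2" \
  -H "Content-Type: application/json" \
  -d '{"assistant_id": "<assistant_id>", "stream": true}'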


1 answer

  1. Prashanth Veeragoni 4,930 Reputation points Microsoft External Staff Moderator
    2025-03-03T02:38:51.42+00:00

    Hi Noel Stieglitz,

    Welcome to Microsoft Q&A forum. Thank you for posting your query.

    The error message:

    "Unsupported value: 'stream' does not support 'true' with this model. Supported values are: 'false'."

    indicates that Azure OpenAI's gpt-4o is not accepting streaming responses in its current deployment version. This could be due to a service-side change by Azure OpenAI or a misconfiguration in your deployment.

    Verify Model Streaming Support

    Check Azure OpenAI’s official documentation for gpt-4o streaming support. Azure frequently updates model capabilities, and it's possible that streaming is no longer supported for gpt-4o (or was removed in a recent update).

    Run this command in the Azure CLI to inspect your deployment's model, version, and capability flags:

    az cognitiveservices account deployment list \
      --resource-group <resource-group-name> \
      --name <openai-service-name> \
      --query "[].{Model: properties.model.name, Version: properties.model.version, Capabilities: properties.capabilities}"

    If the reported capabilities no longer include the feature you need, then streaming has been disabled for this deployment on the service side rather than by anything in your code.

    Since the gpt-4o model was previously working with streaming and suddenly stopped without any changes on your end, this strongly suggests that Azure OpenAI made an internal change that disabled streaming for gpt-4o. Here’s how you can investigate and resolve the issue:

    Check Azure OpenAI Service Updates

    Azure frequently updates their models, and certain capabilities (like streaming) may have been removed or temporarily disabled.

    Check the Azure service status:

    Go to the Azure status page (status.azure.com) and look for any incidents or updates related to Azure OpenAI or gpt-4o in your region (eastus2).

    Check the Azure OpenAI model support page:

    Review the "Azure OpenAI Service models" documentation on Microsoft Learn and confirm whether gpt-4o still supports streaming. A CLI check is also sketched below.
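    If you prefer to check from the CLI rather than the docs page, something along these lines (assuming the standard cognitiveservices command group) lists the models, versions, and capability flags available to your resource:

    # List the models available to this Azure OpenAI resource, including
    # their versions and capability flags; resource names are placeholders.
    az cognitiveservices account list-models \
      --resource-group <resource-group-name> \
      --name <openai-service-name> \
      --output table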

    Try Re-deploying the Model

    If gpt-4o used to work with streaming but suddenly stopped:

    Delete the current gpt-4o deployment.

    Re-deploy gpt-4o in Azure OpenAI and try again (a CLI sketch follows below).
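    A rough sketch of that with the Azure CLI, assuming the standard cognitiveservices commands and a Global Standard SKU; substitute your own resource and deployment names:

    # Remove the existing gpt-4o deployment (placeholder resource names).
    az cognitiveservices account deployment delete \
      --resource-group <resource-group-name> \
      --name <openai-service-name> \
      --deployment-name gpt-4o

    # Recreate it on the same model and version that was deployed before.
    az cognitiveservices account deployment create \
      --resource-group <resource-group-name> \
      --name <openai-service-name> \
      --deployment-name gpt-4o \
      --model-format OpenAI \
      --model-name gpt-4o \
      --model-version 2024-05-13 \
      --sku-name GlobalStandard \
      --sku-capacity 1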

    Since you confirmed that gpt-4o-mini still works with streaming, consider switching to one of the following (a run-level model override is sketched after this list):

      • gpt-4-turbo (if streaming is supported for your deployment)

      • gpt-4o-mini
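    For example, assuming the run-level model override that the Assistants v2 API exposes, the same run call can be pointed at a gpt-4o-mini deployment as a stopgap (all names below are placeholders):

    # Workaround sketch: override the model on the run itself so the streaming
    # request is served by the gpt-4o-mini deployment.
    curl -sS "https://<resource>.openai.azure.com/openai/threads/<thread_id>/runs?api-version=2025-01-01-preview" \
      -H "api-key: $AZURE_OPENAI_API_KEY" \
      -H "OpenAI-Beta: assistants=v2" \
      -H "Content-Type: application/json" \
      -d '{"assistant_id": "<assistant_id>", "model": "gpt-4o-mini", "stream": true}'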

    Hope this helps. Do let us know if you have any further queries.

    Thank you.

