Structured output is not supported in non-OpenAI models like DeepSeek, Mistral, etc.

Bharath Mohan 0 Reputation points
2025-05-26T04:37:05.7566667+00:00

I'm using the latest version of Microsoft.Extensions.AI.AzureAIInference.

Whenever I set response_format, I get the following error from the server. Can you fix it?

Microsoft.Extensions.AI.LoggingChatClient: Error: GetResponseAsync failed.

Azure.RequestFailedException: response_format value as json_schema is enabled only for api versions 2024-08-01-preview and later
Status: 400 (BadRequest)
ErrorCode: BadRequest

Content:
{"error":{"code":"BadRequest","message":"response_format value as json_schema is enabled only for api versions 2024-08-01-preview and later"}}

Azure AI services
A group of Azure services, SDKs, and APIs designed to make apps more intelligent, engaging, and discoverable.

1 answer

  1. SriLakshmi C 6,010 Reputation points Microsoft External Staff Moderator
    2025-05-26T10:52:24.5833333+00:00

    Hello @Bharath Mohan,

    The error you're seeing when setting response_format to json_schema with the Microsoft.Extensions.AI.AzureAIInference package comes down to the API version your requests are sent with: json_schema is supported only on API versions 2024-08-01-preview and later, so any request made against an older version is rejected with exactly this 400 BadRequest.

    First, verify which API version your application is actually using. If it is older than 2024-08-01-preview (for example, 2023-10-01), structured outputs via json_schema are simply not available, and you will need to move to 2024-08-01-preview or later to use this feature.

    // AzureAIInferenceClientOptions is the Azure.AI.Inference options type;
    // pick the newest ServiceVersion value your installed package exposes
    var clientOptions = new AzureAIInferenceClientOptions(
        AzureAIInferenceClientOptions.ServiceVersion.V2024_05_01_Preview);
    var client = new ChatCompletionsClient(
        new Uri(endpoint), new AzureKeyCredential(apiKey), clientOptions);
    

    You can update the API version in your Azure configuration settings to ensure you're using the latest version. If necessary, update your NuGet packages to ensure compatibility with the new API version.
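    If your installed package does not expose a ServiceVersion enum value new enough for json_schema, a commonly used workaround is to override the api-version query parameter on every outgoing request with a per-call pipeline policy. This is a sketch built on Azure.Core's HttpPipelineSynchronousPolicy; ApiVersionOverridePolicy is an illustrative name, and you should verify the RequestUriBuilder.Query behavior against your Azure.Core version.

    ```csharp
    using Azure.Core;
    using Azure.Core.Pipeline;

    // Rewrites the api-version query parameter on every outgoing request.
    public class ApiVersionOverridePolicy : HttpPipelineSynchronousPolicy
    {
        private readonly string _apiVersion;

        public ApiVersionOverridePolicy(string apiVersion) => _apiVersion = apiVersion;

        public override void OnSendingRequest(HttpMessage message)
        {
            // Replace whatever api-version the SDK put on the request
            message.Request.Uri.Query = $"?api-version={_apiVersion}";
        }
    }
    ```

    You would then register it when building the client options, for example with clientOptions.AddPolicy(new ApiVersionOverridePolicy("2024-08-01-preview"), HttpPipelinePosition.PerCall).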

    If upgrading to the latest API version isn't an option, you should adjust the response_format parameter to ensure compatibility with the version you are using. For API versions prior to 2024-08-01-preview, the json_schema feature is unsupported.

    In such cases, you should either omit the response_format parameter entirely or set it to a valid alternative, such as json, which is widely supported across older versions. This adjustment will help prevent errors and allow your requests to proceed successfully with a compatible output format.

    var chatOptions = new ChatOptions
    {
        // ChatResponseFormat.Json requests plain JSON output without a schema,
        // which older API versions accept
        ResponseFormat = ChatResponseFormat.Json
    };
    

    If your application needs to support both older and newer API versions, it’s a good idea to handle versioning dynamically in your code. Based on the API version, you can adjust the response_format accordingly.

    string apiVersion = GetApiVersionFromConfiguration();  // Fetch version dynamically (config/env)

    // Ordinal comparison works here because these version strings sort lexicographically.
    // 'schema' is a JsonElement describing the expected output shape.
    ChatResponseFormat responseFormat =
        string.CompareOrdinal(apiVersion, "2024-08-01-preview") >= 0
            ? ChatResponseFormat.ForJsonSchema(schema)  // structured output when supported
            : ChatResponseFormat.Json;                  // fall back to plain JSON

    var chatOptions = new ChatOptions { ResponseFormat = responseFormat };
    

    When using non-OpenAI models like Mistral or similar, it's important to note that structured outputs such as json_schema are not supported.

    These models do not have built-in support for structured outputs with the response_format parameter, meaning they may not generate valid JSON outputs even if you explicitly prompt them to do so.

    As a solution, avoid using response_format=json_schema with these models.

    Instead, you can prompt these models to emit JSON directly in their text output, but you should treat the result as untrusted: it may be close to valid JSON without actually parsing, so plan on parsing and validating it on your side before using it.
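    The parse-and-validate step above can be sketched as follows. This is a minimal illustration, not part of any SDK: JsonOutputHelper and ExtractJson are hypothetical names, and the extraction heuristic (first "{" to last "}") is deliberately simple.

    ```csharp
    using System.Text.Json;

    static class JsonOutputHelper
    {
        // Pulls the first {...} span out of free-form model text and validates it.
        // Returns null when no parseable JSON object is found.
        public static string? ExtractJson(string modelOutput)
        {
            int start = modelOutput.IndexOf('{');
            int end = modelOutput.LastIndexOf('}');
            if (start < 0 || end <= start) return null;

            string candidate = modelOutput.Substring(start, end - start + 1);
            try
            {
                using var doc = JsonDocument.Parse(candidate);
                return candidate;  // valid JSON
            }
            catch (JsonException)
            {
                return null;  // JSON-like text that doesn't actually parse
            }
        }
    }
    ```

    A reply such as "Sure! Here is the result: {"name": "test"}" would yield the embedded object, while a malformed reply returns null so your code can retry or re-prompt.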

    Please refer to the Structured Outputs documentation.

    I hope this helps. Do let me know if you have any further queries.


    If this answers your query, please click Accept Answer and select Yes for "Was this answer helpful".

    Thank you!

