Why is there a mismatch between the models that support JSON output in practice and in the documentation?

Debjit Kar 0 Reputation points
2025-02-28T07:57:35.2733333+00:00

According to the documentation here, gpt-4o-mini version 2024-07-18 is supposed to support JSON output.
In AI Foundry, the only version available for deployment in every region is gpt-4o-mini version 2024-07-18, tagged "gpt-4o-mini".
But this version doesn't return JSON output when requested from the API. The error message says "response_format value as json_schema is enabled only for api versions 2024-08-01-preview and later".
However, that version isn't available for deployment in Azure AI Foundry.
So how can I get guaranteed JSON output from gpt-4o-mini deployed on Azure AI Foundry?

Azure AI services
A group of Azure services, SDKs, and APIs designed to make apps more intelligent, engaging, and discoverable.

1 answer

  1. Saideep Anchuri 9,425 Reputation points Microsoft External Staff Moderator
    2025-03-02T12:05:07.1433333+00:00

    Hi Debjit Kar,

    The "api versions 2024-08-01-preview and later" in that error message refers to the API version of your request, not to a model version, so it is separate from the gpt-4o-mini model version (2024-07-18) you have deployed. Structured outputs (a response_format of type json_schema) are only enabled when the request is made with API version 2024-08-01-preview or later; the api-version is a query parameter on the request (or a setting on the SDK client), and no different model deployment is needed. In other words, your gpt-4o-mini 2024-07-18 deployment does support JSON output, as the documentation states; you just need to call it with a newer API version. You can check the Azure OpenAI What's new page for the latest supported API versions.
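
    A minimal sketch of what that looks like with the openai Python package is below. The endpoint/key environment variable names and the deployment name "gpt-4o-mini" are placeholders for your own values, and the schema is just an example, not an official sample:

    ```python
    # Sketch: calling a gpt-4o-mini (2024-07-18) deployment with an API version
    # that enables structured outputs. Endpoint, key, and deployment name are
    # placeholders for your own values.
    import os
    from openai import AzureOpenAI

    client = AzureOpenAI(
        azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
        api_key=os.environ["AZURE_OPENAI_API_KEY"],
        api_version="2024-08-01-preview",  # the "API version" the error message refers to
    )

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # your deployment name, not the model version
        messages=[
            {"role": "system", "content": "Extract the event details."},
            {"role": "user", "content": "Team sync on Friday at 10am with Alice and Bob."},
        ],
        response_format={
            "type": "json_schema",
            "json_schema": {
                "name": "calendar_event",
                "strict": True,
                "schema": {
                    "type": "object",
                    "properties": {
                        "name": {"type": "string"},
                        "day": {"type": "string"},
                        "participants": {"type": "array", "items": {"type": "string"}},
                    },
                    "required": ["name", "day", "participants"],
                    "additionalProperties": False,
                },
            },
        },
    )

    print(response.choices[0].message.content)  # JSON matching the schema
    ```

    If you call the REST API directly instead of using the SDK, the same value goes in the api-version query parameter of the request URL, for example ?api-version=2024-08-01-preview.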

    Kindly refer to the links below:

    How to use JSON mode

    gpt-4o

    Thank You.

