How to use structured outputs while working with threads and assistants
I want the response as a JSON object. I'm using GPT-4o-mini (2024-07-18), and the documentation mentions that when making API calls I need to provide the response_format as { type: 'json_schema', json_schema: {} }. Where exactly should I pass the response_format parameter? Should I include it when creating the messages, when creating the run, or when creating the thread?
Azure OpenAI Service
-
Saideep Anchuri • 1,870 Reputation points • Microsoft Vendor
2025-01-20T06:42:15.37+00:00 Welcome to Microsoft Q&A Forum, thank you for posting your query here!
To work with threads and assistants using structured outputs, you need to include the response_format parameter when creating the run. Set this parameter to { "type": "json_schema", "json_schema": {...} } to ensure the model follows the JSON schema you provide.
Attached is a sample json_schema for reference:
"response_format": {
  "type": "json_schema",
  "json_schema": {
    "name": "CalendarEventResponse",
    "strict": true,
    "schema": {
      "type": "object",
      "properties": {
        "name": { "type": "string" },
        "date": { "type": "string" },
        "participants": { "type": "array", "items": { "type": "string" } }
      },
      "required": [ "name", "date", "participants" ],
      "additionalProperties": false
    }
  }
}
Kindly refer to the documentation below: api-support
Thank You.
-
Pushkar Gaikwad • 0 Reputation points
2025-01-21T09:23:53.32+00:00 Hi Saideep Anchuri
I’m working with the GPT-4o model and have encountered an issue with structured output support. When I use GPT-4o-mini, it works perfectly fine with the required structured output. However, with GPT-4o, structured output appears to require model version 2024-08-06, but when creating the assistant I can only select the 2024-05-13 version of GPT-4o.
Is there a way to create an assistant using the gpt-4o model that supports structured output with the 2024-08-06 version? If not, are there any workarounds for this?
-
kothapally Snigdha • 1,260 Reputation points • Microsoft Vendor
2025-01-21T12:00:22.22+00:00 Thank you for clarifying that you are unable to find the 2024-08-06 version of GPT-4o in the UI. In that case, I recommend trying the Python SDK instead. You can modify parameters such as the model deployment name and API version to see if that resolves the issue. Please refer to this document.
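As a rough, untested sketch of that suggestion, creating the assistant through the Python SDK could look like the code below. The endpoint, key, API version, deployment name, and assistant name are placeholders for your own values.

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://YOUR_RESOURCE_NAME.openai.azure.com",
    api_key="YOUR_API_KEY",
    api_version="2024-05-01-preview",  # adjust to an API version available on your resource
)

# "model" must be the name of your own deployment, ideally one backed by the
# gpt-4o 2024-08-06 model version.
assistant = client.beta.assistants.create(
    model="YOUR_GPT4O_DEPLOYMENT_NAME",
    name="structured-output-assistant",
    instructions="Answer using the provided JSON schema.",
)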
Thank you.
-
Saideep Anchuri • 1,870 Reputation points • Microsoft Vendor
2025-01-22T00:13:54.5133333+00:00 We haven’t heard back from you on the last response and were just checking to see if the given response was helpful.
Thank You.
-
Pushkar Gaikwad • 0 Reputation points
2025-01-23T09:53:34.18+00:00 Hi,
I’ve reviewed the documentation and confirmed that the 2024-08-06 version of GPT-4o is listed as supporting the json_schema response format. Despite this, I’m still encountering the error:
{"error"=>{"message"=>"Invalid parameter: 'response_format' of type 'json_schema' is not supported with model version gpt-4o.", "type"=>"invalid_request_error", "param"=>"response_format", "code"=>nil}}
Here is the code I'm using:
@run = @openai_client.runs.create(
  thread_id: 'thread_h2BOcyZ1bhPvI3AAyXwInBBg',
  parameters: {
    assistant_id: 'asst_c2hHTTmzy8axLSwoujIaI09K',
    response_format: {
      "type": "json_schema",
      "json_schema": {
        "name": "pageblock_schema",
        "strict": false,
        "schema": {
          "type": "object",
          "properties": {
            "settings": { "type": "object", "additionalProperties": true }
          },
          "required": [ "settings" ],
          "additionalProperties": false
        }
      }
    }
  }
)
The only parameter I'm using for the response_format is json_schema, which should be compatible with the gpt-4o model version according to the documentation. Could you please confirm if there are any other prerequisites or configurations I might have missed? Or if there might be an issue specific to my setup?
-
Pavankumar Purilla • 3,235 Reputation points • Microsoft Vendor
2025-01-23T21:08:57.81+00:00 Hi Pushkar Gaikwad,
It looks like there is a discrepancy between the documentation and the code you are using. The documentation specifies that the strict parameter should be set to true, while your code has it set to false. This might be causing the issue you are encountering.
Attached is a sample request for reference:
curl -X POST https://YOUR_RESOURCE_NAME.openai.azure.com/openai/deployments/YOUR_MODEL_DEPLOYMENT_NAME/chat/completions?api-version=2024-10-21 \
  -H "api-key: $AZURE_OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      {"role": "system", "content": "Extract the event information."},
      {"role": "user", "content": "Alice and Bob are going to a science fair on Friday."}
    ],
    "response_format": {
      "type": "json_schema",
      "json_schema": {
        "name": "CalendarEventResponse",
        "strict": true,
        "schema": {
          "type": "object",
          "properties": {
            "name": { "type": "string" },
            "date": { "type": "string" },
            "participants": { "type": "array", "items": { "type": "string" } }
          },
          "required": [ "name", "date", "participants" ],
          "additionalProperties": false
        }
      }
    }
  }'
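If it helps to compare against the SDK route discussed earlier, a rough Python equivalent of the curl request above (with the same placeholder endpoint, key, and deployment name) might look like this; it is only a sketch showing where "strict": true sits in the request.

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://YOUR_RESOURCE_NAME.openai.azure.com",
    api_key="YOUR_API_KEY",
    api_version="2024-10-21",
)

completion = client.chat.completions.create(
    model="YOUR_MODEL_DEPLOYMENT_NAME",
    messages=[
        {"role": "system", "content": "Extract the event information."},
        {"role": "user", "content": "Alice and Bob are going to a science fair on Friday."},
    ],
    response_format={
        "type": "json_schema",
        "json_schema": {
            "name": "CalendarEventResponse",
            "strict": True,
            "schema": {
                "type": "object",
                "properties": {
                    "name": {"type": "string"},
                    "date": {"type": "string"},
                    "participants": {"type": "array", "items": {"type": "string"}},
                },
                "required": ["name", "date", "participants"],
                "additionalProperties": False,
            },
        },
    },
)
print(completion.choices[0].message.content)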
Please try using this updated configuration. Let us know if you encounter any further issues!