Hi Taylor Nelson,

Thanks for using the Q&A platform.
Yes, I have seen this too. The OpenAI Deployments API is supposed to return `maxContextToken` and `maxOutputToken`, but some deployments such as `gpt-4o` or `gpt-4.1` may not include those fields, even though they are present for `gpt-3.5` or `gpt-4`. This is likely due to backend deployment schema differences, and will hopefully be addressed in a future rollout.
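To confirm which of your deployments actually expose those fields, you can list them and print whatever capability values come back. The sketch below is a minimal example, assuming the Azure management REST API for Cognitive Services deployments, the `azure-identity` and `requests` packages, and placeholder subscription/resource names; the `api-version` and the `capabilities` key names are assumptions based on what the Deployments API returns in your case.

```python
# Sketch: list deployments on an Azure OpenAI resource and print whatever
# capability fields the service returns, so you can see which deployments
# include maxContextToken / maxOutputToken and which omit them.
import requests
from azure.identity import DefaultAzureCredential

subscription_id = "<subscription-id>"    # placeholder
resource_group = "<resource-group>"      # placeholder
account_name = "<openai-resource-name>"  # placeholder
api_version = "2023-05-01"               # assumed management API version

token = DefaultAzureCredential().get_token(
    "https://management.azure.com/.default"
).token
url = (
    f"https://management.azure.com/subscriptions/{subscription_id}"
    f"/resourceGroups/{resource_group}/providers/Microsoft.CognitiveServices"
    f"/accounts/{account_name}/deployments?api-version={api_version}"
)

resp = requests.get(url, headers={"Authorization": f"Bearer {token}"})
resp.raise_for_status()

for dep in resp.json().get("value", []):
    caps = dep.get("properties", {}).get("capabilities", {}) or {}
    print(
        dep["name"],
        "maxContextToken:", caps.get("maxContextToken", "<missing>"),
        "maxOutputToken:", caps.get("maxOutputToken", "<missing>"),
    )
```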
Until then, the best workaround is to refer to the official model documentation for token limits, or to test them programmatically as in the sketch below.
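Here is one hedged way to probe a deployment's output token limit empirically: send a request with a deliberately oversized `max_tokens` and read the service's validation error, which typically states the real limit in its message. This assumes the `openai` Python package (v1+), an Azure OpenAI endpoint and key, and a deployment name `gpt-4o`; all values are placeholders and the exact wording of the error message may vary.

```python
# Sketch: probe a deployment's max output token limit when the Deployments
# API does not report it, by triggering the service's own validation error.
from openai import AzureOpenAI, BadRequestError

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",  # placeholder
    api_key="<your-api-key>",                                   # placeholder
    api_version="2024-02-01",                                   # assumed API version
)

try:
    client.chat.completions.create(
        model="gpt-4o",  # deployment name, placeholder
        messages=[{"role": "user", "content": "ping"}],
        max_tokens=10_000_000,  # deliberately far above any real limit
    )
except BadRequestError as err:
    # The error message usually mentions the model's maximum allowed tokens;
    # read it manually or parse the number out of it.
    print("Service reported:", err)
```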
If this helps, kindly accept the answer. Thanks much.