Does Azure OpenAI support “Extended Prompt Cache Retention”?
Hello,
I would like to confirm whether Azure OpenAI Service supports the “Extended Prompt Cache Retention” feature described in the OpenAI documentation:
https://developers.openai.com/api/docs/guides/prompt-caching#extended-prompt-cache-retention
According to that documentation, models such as gpt-5.2 support the prompt_cache_retention parameter, which enables extended prompt cache retention.
However, when I attempt to use the same parameter with the gpt-5.2 deployment in Azure OpenAI, I receive the following error:
APIError: prompt_cache_retention is not supported on this model
x-request-id: 79f14e6e-ba6a-42ec-bb19-7d83cb6653f8
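For concreteness, the request body that produces this error looks roughly like the sketch below. The endpoint path, deployment name, and the "24h" retention value are illustrative assumptions based on the OpenAI documentation, not confirmed Azure settings:

```python
import json

# Minimal sketch of the failing request payload. The resource name in the
# endpoint and the retention value "24h" are placeholder assumptions taken
# from the OpenAI docs, not verified Azure OpenAI configuration.
endpoint = "https://<resource>.openai.azure.com/openai/v1/responses"

payload = {
    "model": "gpt-5.2",  # Azure deployment name
    "input": "Hello",
    # The OpenAI-documented parameter that Azure currently rejects with
    # "prompt_cache_retention is not supported on this model".
    "prompt_cache_retention": "24h",
}

print(json.dumps(payload, indent=2))
```

Sending this same payload to the OpenAI API succeeds per their documentation, which is why the Azure-side rejection is surprising.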
This raises a few questions:
- Does Azure OpenAI currently support the Extended Prompt Cache Retention feature?
- If not, is this feature planned for future support?
- If yes, are there specific API versions, regions, or model deployment configurations required?
Any clarification would be greatly appreciated.
Thank you in advance for your support.