Azure API Management as a gateway for AI Foundry model endpoints?

louey Bnecheikh lehocine 145 Reputation points
2025-03-19T10:48:19.28+00:00

Hi

With the Azure OpenAI Service, it is possible to use Azure API Management as a gateway to enable rate limiting, caching, and other useful capabilities.

We now plan to use AI Foundry model endpoints. Is it possible to use API Management as a gateway and apply the same policies to endpoints created from AI Foundry (whether OpenAI models or other models)?

Azure API Management
An Azure service that provides a hybrid, multi-cloud management platform for APIs.

Accepted answer
  Ranashekar Guda 2,820 Reputation points Microsoft External Staff Moderator
    2025-03-20T12:15:21.82+00:00

    Hi @louey Bnecheikh lehocine,
    Yes, it is possible to use Azure API Management as a gateway for AI Foundry model endpoints. Azure API Management provides capabilities such as rate limiting, caching, and other policies that enhance the management of APIs, including those for AI models. While the documentation focuses on Azure OpenAI Service, many of the generative AI gateway capabilities apply to other large language model (LLM) APIs as well, so you can implement similar policies for endpoints created from AI Foundry models.
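    As a rough sketch (the gateway URL, API path, and request body below are placeholder values, not taken from your environment), once the AI Foundry endpoint is imported as an API in API Management, clients call the APIM gateway URL with an APIM subscription key instead of calling the model endpoint directly, and whatever policies you configure (rate limiting, caching, etc.) are enforced on every request. The example also assumes an OpenAI-style chat completions response:

        import requests

        # Placeholder values - substitute your own APIM gateway URL, API path,
        # and subscription key; these names are illustrative only.
        APIM_GATEWAY = "https://my-apim-instance.azure-api.net"
        API_PATH = "/my-foundry-api/chat/completions"
        SUBSCRIPTION_KEY = "<your-apim-subscription-key>"

        # The request goes to the APIM gateway, not to the AI Foundry endpoint,
        # so APIM applies its policies before forwarding to the backend model.
        response = requests.post(
            f"{APIM_GATEWAY}{API_PATH}",
            headers={
                "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,  # default APIM subscription header
                "Content-Type": "application/json",
            },
            json={
                "messages": [{"role": "user", "content": "Hello from behind the gateway"}],
                "max_tokens": 64,
            },
            timeout=30,
        )

        response.raise_for_status()
        # Parsing below assumes an OpenAI-style chat completions payload.
        print(response.json()["choices"][0]["message"]["content"])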

    For further clarification, please refer to the following documentation: Document1, Document2

    I hope this helps resolve your issue. Feel free to reach out if you have further concerns.

    1 person found this answer helpful.

0 additional answers
