How to add APIs of serverless LLM deployments (on AI Foundry) to Azure API Management service?

Vivek Kumar 45 Reputation points
2025-03-19T13:50:12.3866667+00:00

I am adding multiple APIs to an Azure API Management service instance (Consumption plan) via the Azure portal, using Add API > Create from definition > OpenAPI. With that option I am able to add Azure OpenAI APIs using the swagger or inference.json from azure-rest-api-specs.
Now, to add the API of a serverless LLM deployment such as Meta-Llama-3-8B-Instruct: after deploying the model in AI Foundry we get the swagger.json for the deployment as a URL. I downloaded it and used it to add the API, but got this error:

Parsing error(s):
- redact is not a valid property at #/components/schemas/ChatCompletionRequest/properties/messages
- redact is not a valid property at #/components/schemas/ChatCompletionRequest/properties/stop
- redact is not a valid property at #/components/schemas/ChatCompletionRequest/properties/tools
- redact is not a valid property at #/components/schemas/ChatCompletionRequest/properties/user
- redact is not a valid property at #/components/schemas/ChatCompletionRequestOssVllm/properties/messages
- redact is not a valid property at #/components/schemas/ChatCompletionRequestOssVllm/properties/stop
- redact is not a valid property at #/components/schemas/ChatCompletionRequestOssVllm/properties/tools
- const is not a valid property at #/components/schemas/ToolType

Parsing error(s): The input OpenAPI file is not valid for the OpenAPI specification https://github.com/OAI/OpenAPI-Specification/blob/master/versions/3.0.1.md (schema https://github.com/OAI/OpenAPI-Specification/blob/master/schemas/v3.0/schema.yaml).

The OpenAPI version declared in the swagger.json is "openapi": "3.1.0" (while for the Azure OpenAI models it is 3.0.0). I also tried removing those redact and const lines, but the API still did not work. Please answer the following questions:

  1. Is OpenAPI specification 3.1.0 supported by the API Management Consumption plan?
  2. How can I add the APIs of the serverless models (deployed in Azure AI Foundry) to the APIM service? Are there any other ways?

Please guide me in detail.

Azure API Management

Accepted answer
  Khadeer Ali 5,990 Reputation points Microsoft External Staff Moderator
  2025-03-20T17:44:31.3166667+00:00

    @Vivek Kumar ,

    As per our discussion, the issue with importing the AI Foundry API into Azure API Management (APIM) was due to unsupported properties like redact and const in the OpenAPI 3.1 specification. After removing these properties, the API was successfully added.

    Key Considerations & Solution Approach:

    Validation & Modification of OpenAPI Specification

    • The Azure APIM Consumption plan supports OpenAPI 3.0.x fully, and OpenAPI 3.1 for import only.
    • The original swagger.json contained properties (redact, const) that are not compliant with the schema APIM validates against.
    • Removing these properties allowed a successful import.
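    The removal step above can be scripted rather than done by hand. Below is a minimal sketch that recursively strips the offending keys from a spec before import; the UNSUPPORTED set and the sample fragment are illustrative, taken from the error messages in the question, and the version downgrade at the end is optional (per the discussion, removing the properties alone was enough).

    ```python
    import json

    # Keys rejected by APIM's parser, per the error messages in the question.
    # Extend this set if your spec triggers other "not a valid property" errors.
    UNSUPPORTED = {"redact", "const"}

    def strip_unsupported(node):
        """Recursively drop unsupported keys from dicts anywhere in the spec."""
        if isinstance(node, dict):
            return {k: strip_unsupported(v) for k, v in node.items()
                    if k not in UNSUPPORTED}
        if isinstance(node, list):
            return [strip_unsupported(item) for item in node]
        return node

    # Illustrative fragment mirroring the failing schema paths; in practice,
    # load the downloaded swagger.json here instead.
    spec = {
        "openapi": "3.1.0",
        "components": {"schemas": {
            "ChatCompletionRequest": {"properties": {
                "messages": {"type": "array", "redact": True},
                "stop": {"type": "array", "redact": True},
            }},
            "ToolType": {"type": "string", "const": "function"},
        }},
    }

    cleaned = strip_unsupported(spec)

    # Optional: declare a 3.0.x version so APIM validates against the schema
    # it fully supports, rather than relying on import-only 3.1 handling.
    cleaned["openapi"] = "3.0.1"

    print(json.dumps(cleaned, indent=2))
    ```

    Save the cleaned output (e.g. to swagger-clean.json) and use that file in Add API > Create from definition > OpenAPI. Note this is a blunt sketch: it removes every key with those names anywhere in the document, so review the result before importing.
    
    
    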

    Hope this helps. Do click Accept Answer and Yes for "Was this answer helpful." And if you have any further questions, let us know.

