Supported programming languages for Azure AI Inference SDK

Note

This document refers to the Microsoft Foundry (classic) portal.

View the Microsoft Foundry (new) documentation to learn about the new portal.

Important

If you're currently using an Azure AI Inference beta SDK with Microsoft Foundry Models or the Azure OpenAI service, we strongly recommend that you transition to the generally available OpenAI/v1 API, which uses the stable OpenAI SDK.

For more information on how to migrate to the OpenAI/v1 API by using an SDK in your programming language of choice, see Migrate from Azure AI Inference SDK to OpenAI SDK.

All models deployed to Microsoft Foundry Models support the Azure AI Model Inference API and its associated family of SDKs.

To use these SDKs, connect them to the Azure AI model inference URI (usually in the form https://<resource-name>.services.ai.azure.com/models).
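As a minimal sketch of what that connection involves, the following builds the inference URI and a chat-completions request body for the Azure AI Model Inference API. The resource name, deployment name, and prompt are hypothetical placeholders, not values from this article:

```python
def inference_endpoint(resource_name: str) -> str:
    """Build the Azure AI model inference URI for a Foundry resource."""
    return f"https://{resource_name}.services.ai.azure.com/models"


def chat_request_body(model: str, prompt: str) -> dict:
    # Minimal chat-completions payload; changing `model` is how you
    # switch between model deployments on the same resource.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


# Hypothetical resource name, for illustration only.
print(inference_endpoint("my-foundry-resource"))
# → https://my-foundry-resource.services.ai.azure.com/models
```

In practice the SDKs listed below construct and send this request for you; you supply only the endpoint, a credential, and the deployment name.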

Azure AI Inference package

The Azure AI Inference package lets you consume any model deployed to a Foundry resource and switch easily between model deployments. The Azure AI Inference package is part of the Microsoft Foundry SDK.

| Language | Documentation | Package | Examples |
| --- | --- | --- | --- |
| C# | Reference | azure-ai-inference (NuGet) | C# examples |
| Java | Reference | azure-ai-inference (Maven) | Java examples |
| JavaScript | Reference | @azure/ai-inference (npm) | JavaScript examples |
| Python | Reference | azure-ai-inference (PyPI) | Python examples |

Integrations

| Framework | Language | Documentation | Package | Examples |
| --- | --- | --- | --- | --- |
| LangChain | Python | Reference | langchain-azure-ai (PyPI) | Python examples |
| Llama-Index | Python | Reference | llama-index-llms-azure-inference (PyPI), llama-index-embeddings-azure-inference (PyPI) | Python examples |
| Semantic Kernel | Python | Reference | semantic-kernel[azure] (PyPI) | Python examples |
| AutoGen | Python | Reference | autogen-ext[azure] (PyPI) | Quickstart |

Limitations

Foundry doesn't support the Cohere SDK or the Mistral SDK.

Next step