Supported programming languages for models in Azure AI Model Inference
Models deployed to Azure AI model inference can be consumed with different SDKs and programming languages. The following sections describe which ones to use.
All models
All models deployed to Azure AI model inference support the Azure AI model inference API and its associated family of SDKs.
To use these SDKs, connect them to the Azure AI model inference URI (usually in the form https://<resource-name>.services.ai.azure.com/models).
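For example, a minimal Python sketch that connects one of these SDKs (here, the azure-ai-inference client) to that URI might look as follows. The resource name and the AZURE_INFERENCE_CREDENTIAL environment variable are placeholders; substitute your own values.

```python
import os

from azure.ai.inference import ChatCompletionsClient
from azure.core.credentials import AzureKeyCredential

# Point the client at the Azure AI model inference endpoint of your resource.
# <resource-name> and the environment variable below are placeholders.
client = ChatCompletionsClient(
    endpoint="https://<resource-name>.services.ai.azure.com/models",
    credential=AzureKeyCredential(os.environ["AZURE_INFERENCE_CREDENTIAL"]),
)
```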
Azure AI Inference package
The Azure AI Inference package allows you to consume all models deployed to the Azure AI model inference service and easily switch among them. The Azure AI Inference package is part of the Azure AI Foundry SDK.
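As an illustrative sketch, switching models is a matter of changing the model parameter on the same client. The deployment names used below (Mistral-Large-2411 and Phi-4) are examples; use the names of your own deployments.

```python
import os

from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import UserMessage
from azure.core.credentials import AzureKeyCredential

client = ChatCompletionsClient(
    endpoint="https://<resource-name>.services.ai.azure.com/models",
    credential=AzureKeyCredential(os.environ["AZURE_INFERENCE_CREDENTIAL"]),
)

question = [UserMessage(content="How many feet are in a mile?")]

# The same client can route requests to any model deployed to the resource;
# only the model parameter changes. Deployment names here are examples.
for deployment in ["Mistral-Large-2411", "Phi-4"]:
    response = client.complete(messages=question, model=deployment)
    print(deployment, "->", response.choices[0].message.content)
```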
Azure AI Projects package
The Azure AI Projects package allows customers to access a comprehensive set of capabilities from an Azure AI project. Those capabilities include Azure AI model inference, but also advanced capabilities like tracing, evaluation, and data storage. The Azure AI Projects package is part of the Azure AI Foundry SDK and uses the Azure AI Inference package and the Azure OpenAI package to perform inference, depending on the user's needs.
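The exact client surface varies between versions of the azure-ai-projects package, so the following is only a sketch of obtaining an inference client from a project. The connection string environment variable and the deployment name are placeholders.

```python
import os

from azure.ai.projects import AIProjectClient
from azure.identity import DefaultAzureCredential

# Connect to the Azure AI project. The connection string is shown on the
# project's overview page in the Azure AI Foundry portal (placeholder here).
project = AIProjectClient.from_connection_string(
    conn_str=os.environ["AZURE_AI_PROJECT_CONNECTION_STRING"],
    credential=DefaultAzureCredential(),
)

# The project client returns an azure-ai-inference chat completions client
# that is already wired to the project's Azure AI model inference endpoint.
chat = project.inference.get_chat_completions_client()

response = chat.complete(
    model="Phi-4",  # example deployment name; use one of your own deployments
    messages=[{"role": "user", "content": "What does Azure AI model inference do?"}],
)
print(response.choices[0].message.content)
```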