FAQ for Azure AI containers

General questions

What is available?

Azure AI containers allow developers to use the same intelligent APIs that are available in Azure, but with the benefits of containerization. Some containers are available as a gated preview, which might require you to submit an application for access. Other containers are publicly available as an ungated preview, or are generally available. You can find a full list of containers and their availability in the Container support in Azure AI services article. You can also view the containers in the Microsoft Container Registry (MCR).

Is there any difference between the Azure AI services cloud and the containers?

Azure AI containers are an alternative to the Azure AI services cloud. Containers offer the same capabilities as the corresponding cloud services, and customers can deploy them on-premises or in Azure. The core AI technology, pricing tiers, API keys, and API signature are the same for a container and its corresponding cloud service. There are also features and benefits to choosing containers over their cloud service equivalents.

How do I access and use a gated preview container?

Previously, gated preview containers were hosted on the containerpreview.azurecr.io repository. As of September 22, 2020, these containers are hosted on the Microsoft Container Registry, and downloading them doesn't require the docker login command. You can run a gated preview container if your Azure resource was created with an approved Azure subscription ID. You won't be able to run the container if your Azure subscription hasn't been approved after you complete the request form.

Will containers be available for all Azure AI services offerings, and what's the next set of containers I should expect?

We'd like to make more Azure AI services offerings available as containers. Contact your local Microsoft account manager to get updates on new container releases and other Azure AI services announcements.

What is the service-level agreement (SLA) for Azure AI containers?

Important

To learn more about service-level agreements for Azure AI services, visit our SLA page.

Azure AI provides SLAs for cloud-hosted services, which you can view on our SLA page.

However, Azure AI services in containers don't provide an SLA because they're on-premises software. Customers control the resource configuration of Azure AI services containers, so Microsoft can't offer an SLA for general availability (GA). Customers are free to deploy containers on-premises and define the host environments.

Are these containers available in Sovereign clouds?

Standard Azure AI containers can be used in the Sovereign clouds. The containers can run and connect to the billing endpoint in these clouds once authorized, but container images must be pulled from the Public cloud container registry. Containers that rely on language models downloaded at runtime, such as Translator or Custom Speech to text, are also configured to pull their models from Public endpoints.

Purchasing disconnected containers isn't currently supported in Sovereign clouds. Disconnected containers purchased in Public cloud along with all dependencies can be transferred to Sovereign clouds and run on Sovereign cloud infrastructure.
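
If you transfer a disconnected container to a Sovereign cloud, one way to move the image and its dependencies is to export them to an archive on a machine with Public cloud access and load that archive on the target infrastructure. This is only a sketch; the image name and tag are placeholders for the container and dependencies you purchased.

# Export the pulled image to a portable archive on a machine with Public cloud access.
docker save -o ai-container.tar mcr.microsoft.com/azure-cognitive-services/<container-name>:<tag>

# Load the archive into the local Docker engine on the Sovereign cloud infrastructure.
docker load -i ai-container.tar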

Versioning

How are containers updated to the latest version?

Customers can choose when to update the containers that they've deployed. Containers are marked with standard Docker tags such as latest to indicate the most recent version. We encourage customers to pull the latest versions of containers as they're released. For details on how to get notified when an image is updated, see Azure Container Registry webhooks.
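
For example, re-pulling an image by tag refreshes it to the most recent build published under that tag. The repository path here is a placeholder; substitute the container you use:

docker pull mcr.microsoft.com/azure-cognitive-services/<container-name>:latest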

Container license files are used as keys to decrypt certain files within each container image. If these encrypted files are updated in a new container image, your existing license file might fail to start the container, even though it worked with the previous version of the image. To avoid this issue, we recommend that you download a new license file from your container's resource endpoint, shown in the Azure portal, after you pull new image versions from mcr.microsoft.com.

To download a new license file, add DownloadLicense=True to your docker run command, along with a license mount, your API key, and your billing endpoint. Refer to your container's documentation for detailed instructions.
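
The following sketch shows the general shape of that command for a disconnected container. The image name, license directory, and mount are placeholders, and parameter names and casing can vary by container, so treat your container's documentation as authoritative.

docker run --rm -it \
-v {HOST_LICENSE_DIRECTORY}:/license \
mcr.microsoft.com/azure-cognitive-services/<container-name>:latest \
eula=accept \
billing={ENDPOINT_URI} \
apikey={API_KEY} \
DownloadLicense=True \
Mounts:License=/license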

What versions are supported?

Generally, only the current version of the container is supported. We encourage customers to stay current to get the latest patches and technology.

How are updates versioned?

Major version changes indicate that there's a breaking change to the API signature. We anticipate that this indication will generally coincide with major version changes to the corresponding Azure AI services cloud offering. Minor version changes indicate bug fixes, model updates, or new features that don't make a breaking change to the API signature.

Technical questions

How can I diagnose potential errors in my deployment environment?

If you're having trouble running an Azure AI services container, you can try using the Microsoft diagnostics container. Use this container to diagnose common errors in your deployment environment that might prevent Azure AI containers from functioning as expected.

To get the container, use the following docker pull command:

docker pull mcr.microsoft.com/azure-cognitive-services/diagnostic

Then run the container. Replace {ENDPOINT_URI} with your endpoint, and replace {API_KEY} with the key for your resource:

docker run --rm mcr.microsoft.com/azure-cognitive-services/diagnostic \
eula=accept \
Billing={ENDPOINT_URI} \
ApiKey={API_KEY}

The container will test for network connectivity to the billing endpoint.
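
If the diagnostic container reports a connectivity failure, a quick check from the host (assuming curl is installed there) can confirm whether the billing endpoint is reachable from that machine at all:

# Verify that the host can resolve and reach the billing endpoint over HTTPS.
curl -sv {ENDPOINT_URI}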

How should I run the Azure AI containers on IoT devices?

If you don't have a reliable internet connection, want to save on bandwidth costs, have low-latency requirements, or work with sensitive data that must be analyzed on-site, Azure IoT Edge with Azure AI containers gives you consistency with the cloud.
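
As a sketch of one possible workflow, after you author an IoT Edge deployment manifest that lists the container image along with its billing settings, the Azure CLI IoT extension can push that manifest to an edge device. The hub name, device ID, and manifest file name here are placeholders:

az iot edge set-modules --hub-name {IOT_HUB_NAME} --device-id {EDGE_DEVICE_ID} --content ./deployment.json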

Are these containers compatible with OpenShift?

We don't test containers with OpenShift, but generally, Azure AI containers should run on any platform that supports Docker images. If you're using OpenShift, we recommend running the containers as the root user.
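
If your OpenShift project's default security context constraints prevent containers from running as root, one possible approach (a sketch only; adapt it to your cluster's security policy) is to grant the anyuid security context constraint to the service account that runs the pods:

oc adm policy add-scc-to-user anyuid -z default -n <your-project>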

How do I provide product feedback and feature recommendations?

We encourage customers to voice their concerns publicly and to up-vote others who have done the same where potential issues overlap. You can use the feedback tool for both product feedback and feature recommendations.

What status messages and errors do Azure AI containers return?

Here are the status messages and errors:

Valid: Your API key is valid. No action is needed.
Invalid: Your API key is invalid. You must provide a valid API key to run the container. Find your API key and service region in the Azure portal, in the Keys and Endpoint section for your Azure AI services resource.
Mismatch: You've provided an API key or endpoint for a different kind of Azure AI services resource. Find your API key and service region in the Azure portal, in the Keys and Endpoint section for your Azure AI services resource.
CouldNotConnect: The container couldn't connect to the billing endpoint. Check the Retry-After value and wait for this period to end before you make more requests.
OutOfQuota: The API key has exceeded the quota. You can either upgrade your pricing tier or wait for more quota to become available. Find your tier in the Azure portal, in the Pricing Tier section of your Azure AI service resource.
BillingEndpointBusy: The billing endpoint is currently busy. Check the Retry-After value and wait for this period to end before you make more requests.
ContainerUseUnauthorized: The provided API key isn't authorized for use with this container. You're likely using a gated container, so make sure your Azure subscription ID is approved by submitting an online request.
"[ERROR] Failed to download: context deadline exceeded": The model file download request to our servers timed out. Make sure you have a strong internet connection to download all required files within one hour. Model downloads apply to Text Translation and some Speech service containers.
"The provided license path was not found. Please ensure a volume is mounted and a directory exists at the location specified by Mounts:License": Disconnected containers only. There is no license file stored locally at the location specified in your docker run command. You may have mounted the license volume incorrectly. Be sure to check your local file system and provide a valid file path to the desired local storage location for the license file using the -v argument in your docker run command.
Unknown: The server is currently unable to process billing requests.

How do I get support?

Customer support channels are the same as for the Azure AI services cloud-based APIs. All Azure AI containers include logging features that help us and the community support customers.

How does billing work?

Customers are charged based on consumption, similar to the Azure AI services cloud. The containers need to be configured to send metering data to Azure, and transactions are billed accordingly. Usage across the hosted and on-premises services counts toward a single quota with tiered pricing. For more information, see the billing section of the container you're using.
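
Concretely, the metering connection is configured through the same billing arguments shown earlier for the diagnostic container. The image name here is a placeholder, and the exact arguments for your container are listed in its billing section:

docker run --rm -it -p 5000:5000 \
mcr.microsoft.com/azure-cognitive-services/<container-name> \
eula=accept \
Billing={ENDPOINT_URI} \
ApiKey={API_KEY}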

Important

Azure AI containers are not licensed to run without being connected to Azure for metering. Customers need to enable the containers to always communicate billing information with the metering service. Azure AI containers don't send customer data to Microsoft.

Here's an example of the information that a container communicates for metering:

usageRequestBody": {
  "id": "1234abcd-1234-56ab-ab12-1234abcd",
  "containerType": "<container-type>",
  "containerVersion": "<container-version>",
  "containerId": "<contianer-id>",
  "meter": {
    "name": "<meter-name>",
    "quantity": 0.0
  },
  "requestTime": 12345687890,
  "apiType": "<api-type>"
},

What is the current support warranty for containers?

Microsoft's standard warranty for enterprise software applies for all containers formally announced as generally available (GA). There's no warranty for previews.

What happens to Azure AI containers when internet connectivity is lost?

Azure AI containers are not licensed to run without being connected to Azure for metering. Customers need to enable the containers to always communicate with the metering service.

How long can the container operate without being connected to Azure?

Azure AI containers are not licensed to run without being connected to Azure for metering. Customers need to enable the containers to always communicate with the metering service.

What hardware is required to run these containers?

Azure AI containers are x64-based containers that can run on any compatible Linux node, VM, or edge device that supports x64 Linux Docker containers. They all require CPU processors. For more information, see the container requirements and recommendations section of the container you're using.

Are these containers currently supported on Windows?

Azure AI containers are Linux containers, but there's some support for Linux containers on Windows. For more information about Linux containers on Windows, see the Docker documentation.

How do Azure AI containers compare to AWS and Google offerings?

Microsoft is the first cloud provider to move its pretrained AI models into containers with simple billing per transaction, as though customers are using a cloud service. Microsoft believes a hybrid cloud gives customers more choice.

What compliance certifications do containers have?

Azure AI containers don't have any compliance certifications.

What regions are Azure AI containers available in?

Containers can be run anywhere in any region, but they need a key and need to call back to Azure for metering. All supported regions for the cloud service are supported for the containers' metering call.