Working Llama example for Container Apps?

Zach Howell 45 Reputation points
2025-10-13T18:38:19.6+00:00

I tried creating a Container App running Llama 3 by following this tutorial:
https://learn.microsoft.com/en-us/azure/container-apps/serverless-gpu-nim?tabs=bash

However, my container app did not start, instead giving "Activation failed" errors. I looked at some logs but couldn't figure out an exact reason for the failure. AI models don't give the best logging errors at the best of times, and it was not 100% clear I was getting all the right logs in ACA.

I filed https://github.com/microsoft/azure-container-apps/issues/1583 as well on the tutorial.

But I don't necessarily need that tutorial fixed; I'd just like to run Llama 3 on ACA. I also tried a command like:

```shell
az containerapp create \
  --name hzcontllama3 \
  --resource-group $RESOURCE_GROUP \
  --environment $CONTAINERAPPS_ENVIRONMENT \
  --image $ACR_NAME.azurecr.io/$CONTAINER_AND_TAG \
  --cpu 24 --memory 220 \
  --target-port 8000 --ingress external \
  --secrets hftoken=$HF_TOKEN \
  --env-vars hftoken=secretref:hftoken \
  --registry-server $ACR_NAME.azurecr.io \
  --workload-profile-name LLAMA_PROFILE \
  --query properties.configuration.ingress.fqdn \
  --args "--model-id=meta-llama/Llama-3.2-1B-Instruct --max-concurrent-requests 1"
```

where I used `us-docker.pkg.dev/deeplearning-platform-release/gcr.io/huggingface-text-generation-inference-cu121.2-2.ubuntu2204.py310`, a container which worked for me on Cloud Run following their tutorial: https://cloud.google.com/blog/products/ai-machine-learning/how-to-deploy-llama-3-2-1b-instruct-model-with-google-cloud-run

What am I doing wrong? Does anyone have a working example using Llama (3?) on Azure Container Apps? How best should I debug these errors?

Azure Container Apps
An Azure service that provides a general-purpose, serverless container platform.

1 answer

Sort by: Most helpful
  1. Pashikanti Kumar 1,725 Reputation points Microsoft External Staff Moderator
    2025-10-13T22:54:48.2166667+00:00

    Hi Zach Howell,

    Based on your logs showing the error "Probe of StartUp failed with status code: 1", plus the message "Target resource doesn't exist", here are focused troubleshooting steps for running Llama 3 on Azure Container Apps successfully.

    • The startup probe failing usually means the container app’s readiness/liveness probe can't correctly detect your container is ready.
    • ACA uses probes on ports (default 80 or configured port). Make sure your container listens on the correct port and responds with a healthy HTTP status.
    • Action: Confirm that your app inside the container is listening on the ACA-configured target port (e.g., 80 or 8000).
    • If your app listens on port 8000, configure the container app ingress and probes to use port 8000 explicitly.
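As a sketch of the points above (the app and resource names are placeholders from the question, and the `/health` path assumes the HF text-generation-inference container, which exposes a health endpoint on its serving port), ingress and a startup probe can be aligned on port 8000 via a YAML patch applied with `az containerapp update`:

```shell
# Sketch only: names are placeholders; /health on port 8000 is an
# assumption based on the TGI container used in the question.
cat > probe-patch.yaml <<'EOF'
properties:
  configuration:
    ingress:
      external: true
      targetPort: 8000
  template:
    containers:
      - name: hzcontllama3
        probes:
          - type: Startup
            httpGet:
              path: /health
              port: 8000
            initialDelaySeconds: 60
            periodSeconds: 10
            failureThreshold: 60   # large models can take several minutes to download and load
EOF

az containerapp update \
  --name hzcontllama3 \
  --resource-group "$RESOURCE_GROUP" \
  --yaml probe-patch.yaml
```

A generous `failureThreshold` matters here: with default probe settings, ACA can mark the revision as failed ("Activation failed") before the model weights have finished loading.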


    Most often this is a port/probe mismatch or a container startup failure. Align the container's listening port with the ACA ingress and probes, allocate the required GPU resources, and read the logs carefully to resolve activation errors.
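On the GPU side, the `--workload-profile-name LLAMA_PROFILE` in your command must refer to a profile that already exists on the environment. A hedged sketch of how to check and add one (the profile type `Consumption-GPU-NC24-A100` is an assumption; confirm the names available in your region with `az containerapp env workload-profile list-supported`):

```shell
# List the workload profiles currently attached to the environment
az containerapp env workload-profile list \
  --name "$CONTAINERAPPS_ENVIRONMENT" \
  --resource-group "$RESOURCE_GROUP" \
  --output table

# If LLAMA_PROFILE is missing, attach a serverless GPU profile.
# "Consumption-GPU-NC24-A100" is an assumption; verify with:
#   az containerapp env workload-profile list-supported --location $LOCATION
az containerapp env workload-profile add \
  --name "$CONTAINERAPPS_ENVIRONMENT" \
  --resource-group "$RESOURCE_GROUP" \
  --workload-profile-name LLAMA_PROFILE \
  --workload-profile-type Consumption-GPU-NC24-A100
```

If the profile referenced by the app does not exist, activation fails before your container ever runs, which is consistent with the "Target resource doesn't exist" message.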

    Target resource doesn't exist

    This error indicates your application is trying to access a resource (e.g., a file, directory, or external service) that is unavailable or misconfigured.


    Probe Configuration

    The startup probe failure suggests ACA is unable to confirm that your app is ready. Ensure the probe is correctly configured.

    Reference

    az containerapp logs | Microsoft Learn
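For debugging activation failures specifically, it helps to read both log streams; the *system* stream is where ACA reports probe failures, image pull problems, and replica events, while the *console* stream carries whatever the model server writes to stdout/stderr. A sketch (app and group names are placeholders):

```shell
# System logs: activation/probe failures, image pulls, replica events
az containerapp logs show \
  --name hzcontllama3 \
  --resource-group "$RESOURCE_GROUP" \
  --type system --tail 100

# Console logs: output from the model server itself
az containerapp logs show \
  --name hzcontllama3 \
  --resource-group "$RESOURCE_GROUP" \
  --type console --follow
```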

    Enable Application Insights for deeper telemetry and debugging:

    Application logging in Azure Container Apps | Microsoft Learn

    Kindly let us know if the above helps or if you need further assistance with this issue.

    Please "Upvote" if the information helped you. This will help us and others in the community as well.

    Thanks

