Tutorial: Use code interpreter sessions in LlamaIndex with Azure Container Apps

LlamaIndex is a powerful framework for building context-augmented large language model (LLM) applications. When you build an AI agent with LlamaIndex, an LLM interprets user input and generates a response. The AI agent often struggles when it needs to perform mathematical or symbolic reasoning to produce a response. By integrating Azure Container Apps dynamic sessions with LlamaIndex, you give the agent a code interpreter it can use to perform specialized tasks.

In this tutorial, you learn how to run a LlamaIndex AI agent in a web API. The API accepts user input and returns a response generated by the AI agent. The agent uses a code interpreter in dynamic sessions to perform calculations.

Note

Azure Container Apps dynamic sessions is currently in preview. See preview limitations for more information.

Prerequisites

To complete this tutorial, you need an Azure account with an active subscription and the Azure CLI.

Create Azure resources

The sample app in this tutorial uses an LLM from Azure OpenAI. It also uses Azure Container Apps dynamic sessions to run code generated by the LLM.

  1. Update the Azure CLI to the latest version.

     az upgrade
    
  2. Remove the Azure Container Apps extension if it's already installed, and then install a preview version of the extension that contains the commands for sessions:

    az extension remove --name containerapp
    az extension add \
        --name containerapp \
        --allow-preview true -y
    
  3. Sign in to Azure:

    az login
    
  4. Set the variables used in this tutorial:

    RESOURCE_GROUP_NAME=aca-sessions-tutorial
    AZURE_OPENAI_LOCATION=swedencentral
    AZURE_OPENAI_NAME=<UNIQUE_OPEN_AI_NAME>
    SESSION_POOL_LOCATION=eastasia
    SESSION_POOL_NAME=code-interpreter-pool
    

    Replace <UNIQUE_OPEN_AI_NAME> with a unique name to create your Azure OpenAI account.

  5. Create a resource group:

    az group create --name $RESOURCE_GROUP_NAME --location $SESSION_POOL_LOCATION
    
  6. Create an Azure OpenAI account:

    az cognitiveservices account create \
        --name $AZURE_OPENAI_NAME \
        --resource-group $RESOURCE_GROUP_NAME \
        --location $AZURE_OPENAI_LOCATION \
        --kind OpenAI \
        --sku s0 \
        --custom-domain $AZURE_OPENAI_NAME
    
  7. Create a GPT 3.5 Turbo model deployment named gpt-35-turbo in the Azure OpenAI account:

    az cognitiveservices account deployment create \
        --resource-group $RESOURCE_GROUP_NAME \
        --name $AZURE_OPENAI_NAME \
        --deployment-name gpt-35-turbo \
        --model-name gpt-35-turbo \
        --model-version "1106" \
        --model-format OpenAI \
        --sku-capacity "100" \
        --sku-name "Standard"
    
  8. Create a code interpreter session pool:

    az containerapp sessionpool create \
        --name $SESSION_POOL_NAME \
        --resource-group $RESOURCE_GROUP_NAME \
        --location $SESSION_POOL_LOCATION \
        --max-sessions 100 \
        --container-type PythonLTS \
        --cooldown-period 300
    

Run the sample app locally

Before you deploy the app to Azure Container Apps, you can run it locally to test it.

Clone the app

  1. Clone the Azure Container Apps sessions samples repository.

    git clone https://github.com/Azure-Samples/container-apps-dynamic-sessions-samples.git
    
  2. Change to the directory that contains the sample app:

    cd container-apps-dynamic-sessions-samples/llamaindex-python-webapi
    

Configure the app

  1. Create a Python virtual environment and activate it:

    python3.11 -m venv .venv
    source .venv/bin/activate
    

     If you're using a different Python version, change it in the command. Python 3.10 or later is recommended.

    Note

    If you're using Windows, replace .venv/bin/activate with .venv\Scripts\activate.

  2. Install the required Python packages:

    python -m pip install -r requirements.txt
    
  3. To run the app, you need to configure environment variables.

    1. Retrieve the Azure OpenAI account endpoint:

      az cognitiveservices account show \
          --name $AZURE_OPENAI_NAME \
          --resource-group $RESOURCE_GROUP_NAME \
          --query properties.endpoint \
          --output tsv
      
    2. Retrieve the Azure Container Apps session pool management endpoint:

      az containerapp sessionpool show \
          --name $SESSION_POOL_NAME \
          --resource-group $RESOURCE_GROUP_NAME \
          --query properties.poolManagementEndpoint \
          --output tsv
      
    3. Create a .env file in the root of the sample app directory (same location as main.py). Add the following content to the file:

      AZURE_OPENAI_ENDPOINT=<AZURE_OPENAI_ENDPOINT>
      POOL_MANAGEMENT_ENDPOINT=<SESSION_POOL_MANAGEMENT_ENDPOINT>
      

      Replace <AZURE_OPENAI_ENDPOINT> with the Azure OpenAI account endpoint and <SESSION_POOL_MANAGEMENT_ENDPOINT> with the session pool management endpoint.

  4. The app uses DefaultAzureCredential to authenticate with Azure services. On your local machine, it uses your current Azure CLI login credentials. For the app to access the model endpoints and the session pool, assign yourself the Cognitive Services OpenAI User role on the Azure OpenAI account and the Azure ContainerApps Session Executor role on the session pool. A sketch of how the app might load this configuration and create the credential follows these steps.

    1. Retrieve your Azure CLI user name:

      az account show --query user.name --output tsv
      
    2. Run the following command to retrieve the Azure OpenAI account resource ID:

      az cognitiveservices account show --name $AZURE_OPENAI_NAME --resource-group $RESOURCE_GROUP_NAME --query id --output tsv
      
    3. Assign the Cognitive Services OpenAI User role to your Azure CLI user on the Azure OpenAI account:

      az role assignment create --role "Cognitive Services OpenAI User" --assignee <CLI_USERNAME> --scope <AZURE_OPENAI_RESOURCE_ID>
      

      Replace <CLI_USERNAME> with your Azure CLI user name and <AZURE_OPENAI_RESOURCE_ID> with the Azure OpenAI account resource ID.

    4. Run the following command to retrieve the session pool resource ID:

      az containerapp sessionpool show --name $SESSION_POOL_NAME --resource-group $RESOURCE_GROUP_NAME --query id --output tsv
      
    5. Assign the Azure ContainerApps Session Executor role to your Azure CLI user on the session pool:

      az role assignment create \
          --role "Azure ContainerApps Session Executor" \
          --assignee <CLI_USERNAME> \
          --scope <SESSION_POOL_RESOURCE_ID>
      

      Replace <CLI_USERNAME> with your Azure CLI user name and <SESSION_POOL_RESOURCE_ID> with the session pool resource ID.
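
The following is a minimal sketch of how the app could load this configuration and create the credential. It assumes the .env file is read with python-dotenv; the exact code in main.py may differ.

import os

from azure.identity import DefaultAzureCredential
from dotenv import load_dotenv

# Read AZURE_OPENAI_ENDPOINT and POOL_MANAGEMENT_ENDPOINT from the .env file.
load_dotenv()
azure_openai_endpoint = os.environ["AZURE_OPENAI_ENDPOINT"]
pool_management_endpoint = os.environ["POOL_MANAGEMENT_ENDPOINT"]

# Locally, DefaultAzureCredential resolves to your Azure CLI sign-in.
# When deployed to Azure Container Apps, it resolves to the app's managed identity.
credential = DefaultAzureCredential()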

Run the app

Before running the sample app, open main.py in an editor and review the code. The app uses FastAPI to create a web API that accepts a user message in the query string.
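
As a simplified sketch (the parameter name and return shape here are assumptions, not the sample's exact code), the /chat endpoint looks roughly like this:

from fastapi import FastAPI

app = FastAPI()

@app.get("/chat")
def chat(message: str):
    # "message" is read from the query string. The agent (the LlamaIndex
    # ReActAgent created in the next snippet) decides whether to answer
    # directly or to generate Python code and run it in a code interpreter session.
    response = agent.chat(message)
    return {"message": str(response)}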

The following lines of code instantiate an AzureCodeInterpreterToolSpec and provide it to the LlamaIndex agent:

code_interpreter_tool = AzureCodeInterpreterToolSpec(
    pool_managment_endpoint=pool_management_endpoint,
)
agent = ReActAgent.from_tools(code_interpreter_tool.to_tool_list(), llm=llm, verbose=True)

When it needs to perform calculations, the agent uses the code interpreter in AzureCodeInterpreterToolSpec to run the code. The code is executed in a session in the session pool. By default, a random session identifier is generated when you instantiate the tool. If the agent uses the same tool to run multiple Python code snippets, it uses the same session. To ensure each end user has a unique session, use a separate agent and tool for each user.

AzureCodeInterpreterToolSpec is available in the llama-index-tools-azure-code-interpreter package.
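
For example, a per-user factory might look like the following sketch. The helper name is hypothetical; llm and pool_management_endpoint are the same objects used in the snippet above.

def create_agent_for_user():
    # Each new tool instance gets its own random session identifier, so code
    # run through this agent is isolated from other users' sessions.
    tool = AzureCodeInterpreterToolSpec(
        pool_managment_endpoint=pool_management_endpoint,  # keyword as used in the snippet above
    )
    return ReActAgent.from_tools(tool.to_tool_list(), llm=llm, verbose=True)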

  1. Run the sample app:

    fastapi dev main.py
    
  2. Open a browser and navigate to http://localhost:8000/docs. You see the Swagger UI for the sample app.

  3. Expand the /chat endpoint and select Try it out.

  4. Enter What time is it right now? in the message field and select Execute.

    The agent responds with the current time. In the terminal, the logs show that the agent generated Python code to get the current time and ran it in a code interpreter session. You can also call the endpoint from a script, as shown in the sketch after this list.

  5. To stop the app, enter Ctrl+C in the terminal.
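
As an alternative to the Swagger UI, you can call the endpoint from a script while the app is running. This sketch assumes the query parameter is named message, matching the field shown in the Swagger UI:

import requests

resp = requests.get(
    "http://localhost:8000/chat",
    params={"message": "What time is it right now?"},
)
print(resp.status_code)
print(resp.text)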

Optional: Deploy the sample app to Azure Container Apps

To deploy the FastAPI app to Azure Container Apps, you need to create a container image and push it to a container registry. Then you can deploy the image to Azure Container Apps. The az containerapp up command combines these steps into a single command.

You then need to configure managed identity for the app and assign it the proper roles to access Azure OpenAI and the session pool.

  1. Set the variables for the Container Apps environment and app name:

    ENVIRONMENT_NAME=aca-sessions-tutorial-env
    CONTAINER_APP_NAME=chat-api
    
  2. Build and deploy the app to Azure Container Apps:

    az containerapp up \
        --name $CONTAINER_APP_NAME \
        --resource-group $RESOURCE_GROUP_NAME \
        --location $SESSION_POOL_LOCATION \
        --environment $ENVIRONMENT_NAME \
        --env-vars "AZURE_OPENAI_ENDPOINT=<OPEN_AI_ENDPOINT>" "POOL_MANAGEMENT_ENDPOINT=<SESSION_POOL_MANAGMENT_ENDPOINT>" \
        --source .
    

     Replace <AZURE_OPENAI_ENDPOINT> with the Azure OpenAI account endpoint and <SESSION_POOL_MANAGEMENT_ENDPOINT> with the session pool management endpoint.

  3. Enable the system-assigned managed identity for the app:

    az containerapp identity assign \
        --name $CONTAINER_APP_NAME \
        --resource-group $RESOURCE_GROUP_NAME \
        --system-assigned
    
  4. For the app to access Azure OpenAI and the session pool, you need to assign the managed identity the proper roles.

    1. Retrieve the managed identity's principal ID:

      az containerapp show \
          --name $CONTAINER_APP_NAME \
          --resource-group $RESOURCE_GROUP_NAME \
          --query identity.principalId \
          --output tsv
      
    2. Retrieve the session pool resource ID:

      az containerapp sessionpool show \
          --name $SESSION_POOL_NAME \
          --resource-group $RESOURCE_GROUP_NAME \
          --query id \
          --output tsv
      
    3. Assign the managed identity the Azure ContainerApps Session Executor and Contributor roles on the session pool:

      Before you run the following command, replace <PRINCIPAL_ID> and <SESSION_POOL_RESOURCE_ID> with the values you retrieved in the previous steps.

      az role assignment create \
          --role "Azure ContainerApps Session Executor" \
          --assignee <PRINCIPAL_ID> \
          --scope <SESSION_POOL_RESOURCE_ID>
      
      az role assignment create \
          --role "Contributor" \
          --assignee <PRINCIPAL_ID> \
          --scope <SESSION_POOL_RESOURCE_ID>
      
    4. Retrieve the Azure OpenAI account resource ID:

      az cognitiveservices account show \
          --name $AZURE_OPENAI_NAME \
          --resource-group $RESOURCE_GROUP_NAME \
          --query id \
          --output tsv
      
    5. Assign the managed identity the Cognitive Services OpenAI User role on the Azure OpenAI account:

      Before you run the following command, replace <PRINCIPAL_ID> and <AZURE_OPENAI_RESOURCE_ID> with the values you retrieved in the previous steps.

      az role assignment create \
          --role "Cognitive Services OpenAI User" \
          --assignee <PRINCIPAL_ID> \
          --scope <AZURE_OPENAI_RESOURCE_ID>
      
  5. Retrieve the app's fully qualified domain name (FQDN):

    az containerapp show \
        --name $CONTAINER_APP_NAME \
        --resource-group $RESOURCE_GROUP_NAME \
        --query properties.configuration.ingress.fqdn \
        --output tsv
    
  6. Open a browser and go to https://<FQDN>/docs to test the deployed app, replacing <FQDN> with the value from the previous step.

Clean up resources

When you're done with the resources, you can delete them to avoid incurring charges:

az group delete --name $RESOURCE_GROUP_NAME --yes --no-wait

Next steps