Tutorial: Build an agentic web app in Azure App Service with LangGraph or Foundry Agent Service (Python)

This tutorial demonstrates how to add agentic capability to an existing data-driven FastAPI CRUD application. It does this using two different approaches: LangGraph and Foundry Agent Service.

If your web application already has useful features, like shopping, hotel booking, or data management, it's relatively straightforward to add agent functionality by exposing those features as tools (for LangGraph) or as an OpenAPI endpoint (for Foundry Agent Service). In this tutorial, you start with a simple to-do list app. By the end, you'll be able to create, update, and manage tasks with an agent in an App Service app.
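
For the Foundry Agent Service path, "wrapping" often means little more than keeping the CRUD routes you already have, because FastAPI publishes every route in its OpenAPI schema at /openapi.json automatically. The following is a minimal sketch of such a route; the Task model and path are illustrative, not the sample's exact code.

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Task(BaseModel):
    title: str
    completed: bool = False

tasks: list[Task] = []

# An ordinary CRUD route like this is described automatically in /openapi.json,
# which Foundry Agent Service can consume as an OpenAPI tool.
@app.post("/tasks", operation_id="create_task", summary="Create a to-do task")
async def create_task(task: Task) -> Task:
    tasks.append(task)
    return task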

Both LangGraph and Foundry Agent Service let you build agentic web applications with AI-driven capabilities. LangGraph, like Microsoft Agent Framework, is a code-first SDK that runs inside your app, while Foundry Agent Service is a managed service. The following table shows some of the considerations and trade-offs:

Consideration | LangGraph or Microsoft Agent Framework | Foundry Agent Service
Performance | Fast (runs locally) | Slower (managed, remote service)
Development | Full code, maximum control | Low code, rapid integration
Testing | Manual/unit tests in code | Built-in playground for quick testing
Scalability | App-managed | Azure-managed, autoscaled
Security guardrails | Custom implementation required | Built-in content safety and moderation
Identity | Custom implementation required | Built-in agent ID and authentication
Enterprise | Custom integration required | Built-in Microsoft 365/Teams deployment and Microsoft 365 integrated tool calls

In this tutorial, you learn how to:

  • Convert existing app functionality into tools for LangGraph.
  • Add the tools to a LangGraph agent and use it in a web app.
  • Convert existing app functionality into an OpenAPI endpoint for Foundry Agent Service.
  • Call a Foundry agent in a web app.
  • Assign the required permissions for managed identity connectivity.

Prerequisites

Open the sample with Codespaces

The easiest way to get started is by using GitHub Codespaces, which provides a complete development environment with all required tools preinstalled.

  1. Navigate to the GitHub repository at https://github.com/Azure-Samples/app-service-agentic-langgraph-foundry-python.

  2. Select the Code button, select the Codespaces tab, and select Create codespace on main.

  3. Wait a few moments for your Codespace to initialize. When ready, you'll see a fully configured development environment in your browser.

  4. Run the application locally:

    python3 -m venv venv
    source venv/bin/activate
    pip install -r requirements.txt
    uvicorn src.app:app --host 0.0.0.0 --port 3000
    
  5. When you see Your application running on port 3000 is available, select Open in Browser and add a few tasks.

    The agents aren't fully configured yet, so they don't work. You'll configure them later in this tutorial.

Review the agent code

Both approaches use the same implementation pattern: the agent is initialized when the application starts, and it responds to user messages sent as POST requests.
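
For example, the web layer might expose a chat route along these lines; this is a rough sketch, and the route path, request model, and process_message method are assumed names rather than the sample's exact code.

from fastapi import FastAPI
from pydantic import BaseModel

from src.agents.langgraph_task_agent import LangGraphTaskAgent

app = FastAPI()
agent = LangGraphTaskAgent()  # initialized once, when the application starts

class ChatRequest(BaseModel):
    message: str
    thread_id: str

@app.post("/api/chat/langgraph")
async def chat_langgraph(request: ChatRequest):
    # Each POST forwards the user's message and a thread ID to the agent
    reply = await agent.process_message(request.message, request.thread_id)
    return {"reply": reply}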

The LangGraphTaskAgent is initialized in the constructor in src/agents/langgraph_task_agent.py. The initialization code does the following:

  • Configures the AzureChatOpenAI client using environment variables.
  • Creates the prebuilt ReAct agent with memory and a set of CRUD tools for task management (see LangGraph quickstart).
# Module-level imports used by this code
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from langchain_openai import AzureChatOpenAI
from langgraph.prebuilt import create_react_agent

# Initialize Azure OpenAI client
credential = DefaultAzureCredential()
azure_ad_token_provider = get_bearer_token_provider(
    credential, "https://cognitiveservices.azure.com/.default"
)

self.llm = AzureChatOpenAI(
    azure_endpoint=endpoint,
    azure_deployment=deployment_name,
    azure_ad_token_provider=azure_ad_token_provider,
    api_version="2024-10-21"
)

# Define tools
tools = [
    self._create_task_tool(),
    self._get_tasks_tool(),
    self._get_task_tool(),
    self._update_task_tool(),
    self._delete_task_tool()
]

# Create the agent
self.agent = create_react_agent(self.llm, tools, checkpointer=self.memory)
print("LangGraph Task Agent initialized successfully")

When processing user messages, the agent is invoked using ainvoke() with the user's message and a thread ID for conversation continuity:

# Use a per-conversation thread ID so the checkpointer can persist state between turns
config = {"configurable": {"thread_id": thread_id}}

result = await self.agent.ainvoke(
    {"messages": [("user", message)]},
    config=config
)
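
The prebuilt agent returns its state, including the running message list, and the assistant's reply is the content of the last message. A minimal sketch of how the web layer might return it; the response shape is an assumption, not the sample's exact contract:

# The last message in the returned state is the assistant's reply
reply = result["messages"][-1].content
return {"reply": reply, "thread_id": thread_id}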

Deploy the sample application

The sample repository contains an Azure Developer CLI (AZD) template that creates an App Service app with a system-assigned managed identity and deploys the sample application to it.

  1. In the terminal, log into Azure using Azure Developer CLI:

    azd auth login
    

    Follow the instructions to complete the authentication process.

  2. Deploy the Azure App Service app with the AZD template:

    azd up
    
  3. When prompted, give the following answers:

    Question | Answer
    Enter a new environment name: | Type a unique name.
    Select an Azure Subscription to use: | Select the subscription.
    Pick a resource group to use: | Select Create a new resource group.
    Select a location to create the resource group in: | Select Sweden Central.
    Enter a name for the new resource group: | Press Enter to accept the default.
  4. In the AZD output, find the URL of your app and navigate to it in the browser. The URL looks like this:

     Deploying services (azd deploy)
    
       (✓) Done: Deploying service web
       - Endpoint: <URL>
     
  5. Open the autogenerated OpenAPI schema at the https://....azurewebsites.net/openapi.json path. You need this schema later; you can also download it from the command line, as sketched below.

    You now have an App Service app with a system-assigned managed identity.
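
    If you want a local copy of the schema from the previous step, you can download it from the command line; replace the placeholder with your app's hostname:

    curl https://<app-name>.azurewebsites.net/openapi.json -o openapi.json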

Create and configure the Microsoft Foundry resource

  1. In the Foundry portal, make sure the New Foundry toggle at the top of the page is turned on, and then create a project.

  2. Deploy a model of your choice (see Microsoft Foundry Quickstart: Create resources).

  3. From the top of the model playground, copy the model name.

  4. The easiest way to get the Azure OpenAI endpoint is still from the classic portal. Turn off the New Foundry toggle to switch back to the classic portal, select Azure OpenAI, and then copy the URL in Azure OpenAI endpoint for later.

    Screenshot showing how to copy the OpenAI endpoint and the foundry project endpoint in the foundry portal.

Assign required permissions

  1. From the top menu of the new Foundry portal, select Operate, then select Admin. In the row for your Foundry project, you should see two links. The one in the Name column is the Foundry project resource, and the one in the Parent resource column is the Foundry resource.

    Screenshot showing how to quickly go to the foundry resource or foundry project resource.

  2. Select the Foundry resource in the Parent resource column, and then select Manage this resource in the Azure portal. From the Azure portal, you can assign role-based access for the resource to the deployed web app.

  3. Add the following role for the App Service app's managed identity and for your own Azure user account (your account needs the role to run the app locally later in this tutorial):

    Target resource | Required role | Needed for
    Foundry resource | Cognitive Services OpenAI User | The chat completion calls that the LangGraph agent makes through its AzureChatOpenAI client.

    For instructions, see Assign Azure roles using the Azure portal.
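
    If you prefer the command line, you can make the same assignment with the Azure CLI. This is a sketch; the managed identity's principal ID and the Foundry resource ID are placeholders you supply:

    az role assignment create \
        --assignee <managed-identity-principal-id> \
        --role "Cognitive Services OpenAI User" \
        --scope <foundry-resource-id>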

Configure connection variables in your sample application

  1. Open .env. Using the values you copied earlier from the Foundry portal, configure the following variables:

    Variable | Description
    AZURE_OPENAI_ENDPOINT | Azure OpenAI endpoint (copied from the classic Foundry portal).
    AZURE_OPENAI_DEPLOYMENT_NAME | Model name of the deployment (copied from the model playground in the new Foundry portal).

    Note

    To keep the tutorial simple, you'll use these variables in .env instead of overwriting them with app settings in App Service.
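
    For example, after you fill in the values, .env might look like this (placeholder values shown; use the exact endpoint you copied from the portal):

    AZURE_OPENAI_ENDPOINT=https://<your-resource>.openai.azure.com/
    AZURE_OPENAI_DEPLOYMENT_NAME=<your-model-deployment-name>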


  2. Sign in to Azure with the Azure CLI:

    az login
    

    This allows the Azure Identity client library in the sample code to receive an authentication token for the logged-in user. Remember that you added the required role for this user earlier.

  3. Run the application locally:

    source venv/bin/activate
    uvicorn src.app:app --host 0.0.0.0 --port 3000
    
  4. When you see Your application running on port 3000 is available, select Open in Browser.

  5. Select the LangGraph Agent link and the Foundry Agent link to try out the chat interface. If you get a response, your application is connecting successfully to the Microsoft Foundry resource.

  6. Back in the GitHub codespace, deploy your app changes.

    azd up
    
  7. Navigate to the deployed application again and test the chat agents.

Clean up resources

When you're done with the application, you can delete the App Service resources to avoid incurring further costs:

azd down --purge

The AZD template doesn't manage the Microsoft Foundry resources, so azd down doesn't delete them. Delete them manually if you no longer need them.

More resources