Deploy and run workflows with the Dapr extension for Azure Kubernetes Service (AKS)

With Dapr Workflow, you can easily orchestrate messaging, state management, and failure-handling logic across various microservices. Dapr Workflow can help you create long-running, fault-tolerant, and stateful applications.

In this guide, you use the provided order processing workflow example to:

  • Create an AKS cluster for this sample.
  • Install the Dapr extension on your AKS cluster.
  • Deploy the sample application to AKS.
  • Start and query workflow instances using HTTP API calls.

The workflow example is an ASP.NET Core project that defines the OrderProcessingWorkflow used in this guide and exposes an HTTP endpoint for restocking inventory.

Note

Dapr Workflow is currently a beta feature and is on a self-service, opt-in basis. Beta Dapr APIs and components are provided "as is" and "as available," and are continually evolving as they move toward stable status. Beta APIs and components are not covered by customer support.

Prerequisites

  • An Azure subscription.
  • The Azure CLI, or Azure Cloud Shell.
  • kubectl and Helm 3, which this guide uses to deploy the sample (both are preinstalled in Azure Cloud Shell).

Set up the environment

Clone the sample project

Clone the example workflow application.

git clone https://github.com/Azure/dapr-workflows-aks-sample.git

Navigate to the sample's root directory.

cd dapr-workflows-aks-sample

Create a Kubernetes cluster

Create a resource group to hold the AKS cluster.

az group create --name myResourceGroup --location eastus

Create an AKS cluster.

az aks create --resource-group myResourceGroup --name myAKSCluster --node-count 2 --generate-ssh-keys 

Make sure kubectl is installed and that its current context points to your AKS cluster. If you use Azure Cloud Shell, kubectl is already installed.
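
You can point kubectl at the cluster you just created by downloading its credentials, using the resource group and cluster names from the previous commands:

az aks get-credentials --resource-group myResourceGroup --name myAKSCluster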

For more information, see the Deploy an AKS cluster tutorial.

Deploy the application to AKS

Install Dapr on your AKS cluster

Install the Dapr extension on your AKS cluster. Before you start, make sure you've installed the k8s-extension Azure CLI extension and registered the Microsoft.KubernetesConfiguration resource provider on your subscription.
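
If you haven't already, the following commands are one way to handle both steps; skip any step you've already completed:

az extension add --name k8s-extension
az provider register --namespace Microsoft.KubernetesConfiguration

Then install the extension: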

az k8s-extension create --cluster-type managedClusters --cluster-name myAKSCluster --resource-group myResourceGroup --name dapr --extension-type Microsoft.Dapr

Verify that Dapr has been installed by running the following command:

kubectl get pods -A
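
The Dapr extension installs the Dapr control-plane pods into the dapr-system namespace, so you can also scope the check to that namespace:

kubectl get pods -n dapr-system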

Deploy the Redis Actor state store component

Navigate to the Deploy directory of the sample you cloned earlier:

cd Deploy

Deploy the Redis component:

helm repo add bitnami https://charts.bitnami.com/bitnami
helm install redis bitnami/redis
kubectl apply -f redis.yaml

Run the application

Once you've deployed Redis, deploy the application to AKS:

kubectl apply -f deployment.yaml

Expose the Dapr sidecar and the sample app:

kubectl apply -f service.yaml
export APP_URL=$(kubectl get svc/workflows-sample -o jsonpath='{.status.loadBalancer.ingress[0].ip}')
export DAPR_URL=$(kubectl get svc/workflows-sample-dapr -o jsonpath='{.status.loadBalancer.ingress[0].ip}')

Verify that both environment variables were set:

echo $APP_URL
echo $DAPR_URL
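
If either value is empty, the services' external IP addresses may not be provisioned yet. You can watch the services until an EXTERNAL-IP appears, then rerun the export commands above:

kubectl get svc --watch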

Start the workflow

Now that the application and Dapr have been deployed to the AKS cluster, you can start and query workflow instances. Begin by making an API call to the sample app to restock items in the inventory:

curl -X GET $APP_URL/stock/restock

Start the workflow:

curl -X POST $DAPR_URL/v1.0-alpha1/workflows/dapr/OrderProcessingWorkflow/1234/start \
  -H "Content-Type: application/json" \
  -d '{ "input" : {"Name": "Paperclips", "TotalCost": 99.95, "Quantity": 1}}'

Expected output:

{"instance_id":"1234"}

Check the workflow status:

curl -X GET $DAPR_URL/v1.0-alpha1/workflows/dapr/OrderProcessingWorkflow/1234

Expected output:

{
  "WFInfo":
  {
    "instance_id":"1234"
  },
  "start_time":"2023-03-03T19:19:16Z",
  "metadata":
  {
    "dapr.workflow.custom_status":"",
    "dapr.workflow.input":"{\"Name\":\"Paperclips\",\"Quantity\":1,\"TotalCost\":99.95}",
    "dapr.workflow.last_updated":"2023-03-03T19:19:33Z",
    "dapr.workflow.name":"OrderProcessingWorkflow",
    "dapr.workflow.output":"{\"Processed\":true}",
    "dapr.workflow.runtime_status":"COMPLETED"
  }
}

Notice that the workflow's runtime status is marked as COMPLETED.
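
The instance ID in the URL (1234 in this guide) is supplied by the caller, so you can run additional orders side by side by starting new instances with different IDs. For example, using the same endpoints with an arbitrary instance ID of 5678:

curl -X POST $DAPR_URL/v1.0-alpha1/workflows/dapr/OrderProcessingWorkflow/5678/start \
  -H "Content-Type: application/json" \
  -d '{ "input" : {"Name": "Paperclips", "TotalCost": 99.95, "Quantity": 1}}'

curl -X GET $DAPR_URL/v1.0-alpha1/workflows/dapr/OrderProcessingWorkflow/5678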

Next steps

Learn how to add configuration settings to the Dapr extension on your AKS cluster.