JMeter implementation for a load testing pipeline

Azure Container Instances
Azure Pipelines
Azure Container Registry

Solution ideas

This article is a solution idea.

This article provides an overview of an implementation for a scalable cloud load testing pipeline.


Diagram of a load testing pipeline with JMeter, ACI, and Terraform.

Download a Visio file of this architecture.


The Microsoft CSE team structured the load testing implementation into two Azure Pipelines:

  1. One pipeline builds a custom JMeter Docker container and pushes the image to Azure Container Registry (ACR). This structure provides flexibility for adding any JMeter plugin.

  2. The other pipeline:

    1. Validates the JMeter test definition (.jmx file).

    2. Dynamically provisions the load testing infrastructure.

    3. Runs the load test.

    4. Publishes the test results and artifacts to Azure Pipelines.

    5. Destroys the infrastructure.

First, the solution creates and runs the Docker pipeline, and then it creates the JMeter pipeline.

Azure Pipelines triggers and controls the flow. During setup, the solution provisions the JMeter agents as ACI instances by using the JMeter Remote Testing approach.

A JMeter controller:

  • Configures all workers using its own protocol.

  • Combines all load testing results.

  • Generates resulting artifacts like dashboards and logs.

Docker pipeline and JMeter pipeline definition files are in YAML (.yml) format. The files contain settings such as branches, paths, and variables. First the solution creates the pipelines. Then the developer can run the JMeter pipeline from the command line by specifying the JMeter test definition file and the number of JMeter workers required for the test.

To integrate with Azure test results, the solution uses a Python script to convert the JMeter test results format (.jtl file) to JUnit format (.xml file).
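The article doesn't include the team's conversion script, but a minimal sketch of such a conversion might look like the following. The `jtl_to_junit` helper is illustrative, not the actual implementation, and it assumes the .jtl file uses JMeter's default CSV headers:

```python
import csv
import xml.etree.ElementTree as ET

def jtl_to_junit(jtl_path: str, junit_path: str) -> None:
    """Convert a JMeter CSV results file (.jtl) to a JUnit-style XML report.

    Assumes the .jtl file uses JMeter's default CSV headers, which include
    timeStamp, elapsed (in milliseconds), label, success, and failureMessage.
    """
    with open(jtl_path, newline="") as f:
        samples = list(csv.DictReader(f))

    failed = [s for s in samples if s["success"].lower() != "true"]
    # One <testcase> per JMeter sample; counts go on the <testsuite> element.
    suite = ET.Element("testsuite", name="JMeter",
                       tests=str(len(samples)), failures=str(len(failed)))
    for s in samples:
        case = ET.SubElement(suite, "testcase", name=s["label"],
                             time=str(int(s["elapsed"]) / 1000.0))
        if s["success"].lower() != "true":
            ET.SubElement(case, "failure",
                          message=s.get("failureMessage") or "")
    ET.ElementTree(suite).write(junit_path, encoding="utf-8",
                                xml_declaration=True)
```

The resulting .xml file can then be published to Azure Pipelines with the PublishTestResults task, which accepts the JUnit format.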

Sample Azure Pipelines dashboard displaying successful requests.


Scenario details

This solution implements a scalable cloud load testing pipeline. The testing pipeline performs the following tasks to carry out stress testing:

  • Creates infrastructure on demand.

  • Deploys the infrastructure.

  • Executes testing.

  • Reports results.

  • Destroys infrastructure on demand.

The implementation uses Apache JMeter and Terraform to provision and destroy the required infrastructure in Azure. It also enables observation and viewing of test results. The Microsoft Commercial Software Engineering (CSE) team used it to help a customer create a banking system cloud transformation solution.

This implementation enables the following capabilities:

  • Viewing combined data in a dashboard to monitor the scalability and performance of a solution infrastructure.

  • The ability to determine:

    • The impact of infrastructure scalability.

    • The reaction to failures in the existing architectural design and various workloads.

    The CSE team made these determinations by observing a set of simulations. They ran functional scenarios in the simulations and monitored the performance and scalability of the infrastructure.

  • Support for any system that exposes a JMeter-supported endpoint, such as Azure Container Instances (ACI) or Azure Kubernetes Service (AKS). The implementation carries out pod and node autoscaling and performance tests on these services.

The implementation also supports:

  • Executing performance tests over the microservices until the solution reaches or surpasses a target number of transactions per second.

  • Executing horizontal pod/node autoscaling tests over microservices.

  • Providing observability into specific solution components by capturing metrics (for example, with Prometheus and Grafana).

  • Providing a detailed report about the tests executed, the applications' behavior, and, where applicable, the partitioning strategies adopted (for example, with Kafka).
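As an illustration of the transactions-per-second target mentioned above, a throughput figure can be derived from the same .jtl results format. This sketch is not part of the described implementation; the `transactions_per_second` helper is hypothetical and assumes JMeter's default CSV headers with epoch-millisecond timestamps:

```python
import csv

def transactions_per_second(jtl_path: str) -> float:
    """Average successful transactions per second over the test window.

    Assumes JMeter's default CSV .jtl headers, where timeStamp is epoch
    milliseconds and success is the string "true" or "false".
    """
    with open(jtl_path, newline="") as f:
        ok = [r for r in csv.DictReader(f) if r["success"].lower() == "true"]
    if not ok:
        return 0.0
    stamps = sorted(int(r["timeStamp"]) for r in ok)
    # Guard against a zero-length window when all samples share a timestamp.
    window_s = max((stamps[-1] - stamps[0]) / 1000.0, 1.0)
    return len(ok) / window_s
```

A pipeline gate could compare this figure against the target value and fail the run when the target isn't reached.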

This implementation provides the following advantages:

  • Full integration with Azure.

  • An alternative to other proprietary or deprecated solutions.

  • Fully open-source.

Potential use cases

This solution is ideal for any scenario in which there's a need to evaluate the capability of different infrastructure designs and configurations to handle different types of loads.

Next steps