Scenarios for deploying Azure Load Testing in a virtual network

In this article, you'll learn about the scenarios for deploying Azure Load Testing Preview in a virtual network (VNET). This deployment is sometimes called VNET injection.

This functionality enables the following usage scenarios:

- Load test an Azure-hosted private endpoint.
- Load test a public endpoint with access restrictions.
- Load test an on-premises hosted service, connected via Azure ExpressRoute.

When you deploy Azure Load Testing in a virtual network, the load test engine virtual machines are attached to the virtual network in your subscription. The load test engines can then communicate with the other resources in the virtual network, such as a private application endpoint. The test engine compute itself remains managed by the Azure Load Testing service and isn't billed as virtual machine compute in your subscription.
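As a sketch of such a deployment, the following Azure CLI commands create a load testing resource and then run a test whose engines are injected into a subnet of your virtual network. The resource names, the subnet resource ID, and the test configuration file are placeholders, and the exact parameters of the `az load` extension may differ by CLI version, so check the extension documentation before relying on them.

```shell
# Install the Azure Load Testing CLI extension (placeholder names throughout).
az extension add --name load

# Create an Azure Load Testing resource.
az load create \
  --name my-load-test-resource \
  --resource-group my-rg \
  --location eastus

# Create a test that injects the load test engines into an existing subnet.
# The subnet resource ID below is a placeholder for your own virtual network.
az load test create \
  --load-test-resource my-load-test-resource \
  --resource-group my-rg \
  --test-id my-vnet-test \
  --load-test-config-file loadtest-config.yaml \
  --subnet-id "/subscriptions/<subscription-id>/resourceGroups/my-rg/providers/Microsoft.Network/virtualNetworks/my-vnet/subnets/loadtest-subnet"
```

These commands run against a live Azure subscription, so they can't be executed standalone.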


When you deploy Azure Load Testing in a virtual network, you incur additional charges. Azure Load Testing deploys an Azure Load Balancer and a public IP address in your subscription, and there might be a cost for the generated traffic. For more information, see the Virtual Network pricing information.

The following diagram provides a technical overview:

Diagram that shows the Azure Load Testing VNET injection technical overview.


Azure Load Testing is currently in preview. For legal terms that apply to Azure features that are in beta, in preview, or otherwise not yet released into general availability, see the Supplemental Terms of Use for Microsoft Azure Previews.

Scenario: Load test an Azure-hosted private endpoint

In this scenario, you've deployed an application endpoint in a virtual network on Azure, which isn't publicly accessible. For example, the endpoint could be behind an internal load balancer, or running on a VM with a private IP address.

Diagram that shows the set-up for load testing a private endpoint hosted on Azure.

When you deploy Azure Load Testing in the virtual network, the load test engines can communicate with the application endpoint. If you use separate subnets for the application endpoint and Azure Load Testing, make sure that communication between the subnets isn't blocked, for example by a network security group (NSG). Learn how network security groups filter network traffic.
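For example, assuming the application endpoint lives in an `app-subnet` (10.0.1.0/24) and the load test engines are injected into a `loadtest-subnet` (10.0.2.0/24) — all names, address ranges, and the port are illustrative — an NSG rule like the following allows the test traffic through on port 443:

```shell
# Allow inbound HTTPS traffic from the load test subnet to the application subnet.
# Subnet names, address prefixes, and the port are placeholders for your environment.
az network nsg rule create \
  --resource-group my-rg \
  --nsg-name app-subnet-nsg \
  --name AllowLoadTestEngines \
  --priority 100 \
  --direction Inbound \
  --access Allow \
  --protocol Tcp \
  --source-address-prefixes 10.0.2.0/24 \
  --destination-address-prefixes 10.0.1.0/24 \
  --destination-port-ranges 443
```

This command runs against a live Azure subscription, so it can't be executed standalone.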

Scenario: Load test a public endpoint with access restrictions

In this scenario, you've deployed a publicly available web service in Azure, or any other location. Access to the endpoint is restricted to specific client IP addresses. For example, the service could be running behind an Azure Application Gateway, hosted on Azure App Service with access restrictions, or deployed behind a web application firewall.

Diagram that shows the set-up for load testing a public endpoint hosted on Azure with access restrictions.

To restrict access to the endpoint for the load test engines, you need a known range of public IP addresses for the test engine virtual machines. You deploy a NAT gateway resource in the virtual network, and then create and run a load test in the virtual network. A NAT gateway is a fully managed Azure service that provides source network address translation (SNAT).

Attach the NAT gateway to the subnet in which the load test engines are injected. You can configure the public IP addresses that the NAT gateway uses. The load test engine VMs then use these IP addresses to generate load, and you can allowlist them to restrict access to your application endpoint.
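The steps above can be sketched with the Azure CLI as follows; the resource, virtual network, and subnet names are placeholders for your own environment:

```shell
# Create a static public IP address that the NAT gateway uses for outbound (SNAT) traffic.
az network public-ip create \
  --resource-group my-rg \
  --name loadtest-nat-ip \
  --sku Standard \
  --allocation-method Static

# Create the NAT gateway and associate it with the public IP address.
az network nat gateway create \
  --resource-group my-rg \
  --name loadtest-nat \
  --public-ip-addresses loadtest-nat-ip

# Attach the NAT gateway to the subnet where the load test engines are injected.
az network vnet subnet update \
  --resource-group my-rg \
  --vnet-name my-vnet \
  --name loadtest-subnet \
  --nat-gateway loadtest-nat
```

You can then allowlist the address of `loadtest-nat-ip` on your application endpoint. These commands run against a live Azure subscription, so they can't be executed standalone.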

Scenario: Load test an on-premises hosted service, connected via Azure ExpressRoute

In this scenario, you have an on-premises application endpoint, which isn't publicly accessible. The on-premises environment is connected to Azure by using Azure ExpressRoute.

Diagram that shows the set-up for load testing an on-premises hosted, private endpoint connected via Azure ExpressRoute.

ExpressRoute lets you extend your on-premises networks into the Microsoft cloud over a private connection with the help of a connectivity provider. Deploy Azure Load Testing in an Azure virtual network and then connect the network to your ExpressRoute circuit. After you've set up the connection, the load test engines can connect to the on-premises hosted application endpoint.
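A minimal sketch of connecting the load testing virtual network to an existing ExpressRoute circuit with the Azure CLI follows. It assumes the virtual network already contains a `GatewaySubnet`; the gateway name, SKU, and the circuit resource ID are placeholders:

```shell
# Public IP address required by the virtual network gateway.
az network public-ip create \
  --resource-group my-rg \
  --name er-gw-ip \
  --sku Standard \
  --allocation-method Static

# Create an ExpressRoute virtual network gateway in the VNet's GatewaySubnet.
# Gateway creation can take a long time to complete.
az network vnet-gateway create \
  --resource-group my-rg \
  --name er-gateway \
  --vnet my-vnet \
  --gateway-type ExpressRoute \
  --sku Standard \
  --public-ip-address er-gw-ip

# Connect the gateway to the existing ExpressRoute circuit (placeholder circuit ID).
az network vpn-connection create \
  --resource-group my-rg \
  --name er-connection \
  --vnet-gateway1 er-gateway \
  --express-route-circuit2 "/subscriptions/<subscription-id>/resourceGroups/er-rg/providers/Microsoft.Network/expressRouteCircuits/my-circuit"
```

These commands run against a live Azure subscription, so they can't be executed standalone.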

Next steps