Template types & usage
Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019
Templates let you define reusable content, logic, and parameters. Templates function in two ways. You can insert reusable content with a template or you can use a template to control what is allowed in a pipeline. The second approach is useful for building secure pipelines with templates.
If a template is used to include content, it functions like an include directive in many programming languages. Content from one file is inserted into another file. When a template controls what is allowed in a pipeline, the template defines logic that another file must follow.
Use templates to define your logic once and then reuse it several times. Templates combine the content of multiple YAML files into a single pipeline. You can pass parameters into a template from your parent pipeline.
Parameters
You can specify parameters and their data types in a template and reference those parameters in a pipeline. With templateContext, you can also pass properties to stages, steps, and jobs that are used as parameters in a template.
You can also use parameters outside of templates. You can only use literals for parameter default values.
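For example, here's a minimal sketch of a runtime parameter declared directly in a pipeline rather than in a template (the parameter name image and its values are illustrative); note that the default is a literal, not an expression:
# File: azure-pipelines.yml
parameters:
- name: image
  displayName: Pool image
  type: string
  default: ubuntu-latest # default values must be literals
  values:
  - ubuntu-latest
  - windows-latest

trigger: none

jobs:
- job: build
  pool:
    vmImage: ${{ parameters.image }}
  steps:
  - script: echo Building on ${{ parameters.image }}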
Passing parameters
Parameters must contain a name and data type. In azure-pipelines.yml, when the parameter yesNo is set to a boolean value, the build succeeds. When yesNo is set to a string such as apples, the build fails.
# File: simple-param.yml
parameters:
- name: yesNo # name of the parameter; required
type: boolean # data type of the parameter; required
default: false
steps:
- script: echo ${{ parameters.yesNo }}
# File: azure-pipelines.yml
trigger:
- main
extends:
template: simple-param.yml
parameters:
yesNo: false # set to a non-boolean value to have the build fail
Parameters to select a template at runtime
You can call different templates from a pipeline YAML depending on a condition. In this example, the experimental.yml YAML runs when the parameter experimentalTemplate is true.
#azure-pipeline.yml
parameters:
- name: experimentalTemplate
displayName: 'Use experimental build process?'
type: boolean
default: false
steps:
- ${{ if eq(parameters.experimentalTemplate, true) }}:
- template: experimental.yml
- ${{ if not(eq(parameters.experimentalTemplate, true)) }}:
- template: stable.yml
Parameter data types
| Data type | Notes |
|---|---|
| string | string |
| number | may be restricted to values:, otherwise any number-like string is accepted |
| boolean | true or false |
| object | any YAML structure |
| step | a single step |
| stepList | sequence of steps |
| job | a single job |
| jobList | sequence of jobs |
| deployment | a single deployment job |
| deploymentList | sequence of deployment jobs |
| stage | a single stage |
| stageList | sequence of stages |
The step, stepList, job, jobList, deployment, deploymentList, stage, and stageList data types all use standard YAML schema format. This example includes string, number, boolean, object, step, and stepList.
parameters:
- name: myString
type: string
default: a string
- name: myMultiString
type: string
default: default
values:
- default
- ubuntu
- name: myNumber
type: number
default: 2
values:
- 1
- 2
- 4
- 8
- 16
- name: myBoolean
type: boolean
default: true
- name: myObject
type: object
default:
foo: FOO
bar: BAR
things:
- one
- two
- three
nested:
one: apple
two: pear
count: 3
- name: myStep
type: step
default:
script: echo my step
- name: mySteplist
type: stepList
default:
- script: echo step one
- script: echo step two
trigger: none
jobs:
- job: stepList
steps: ${{ parameters.mySteplist }}
- job: myStep
steps:
- ${{ parameters.myStep }}
You can iterate through an object and print each string in the object.
parameters:
- name: listOfStrings
type: object
default:
- one
- two
steps:
- ${{ each value in parameters.listOfStrings }}:
- script: echo ${{ value }}
Additionally, you can iterate through nested elements within an object.
parameters:
- name: listOfFruits
type: object
default:
- fruitName: 'apple'
colors: ['red','green']
- fruitName: 'lemon'
colors: ['yellow']
steps:
- ${{ each fruit in parameters.listOfFruits }} :
- ${{ each fruitColor in fruit.colors}} :
- script: echo ${{ fruit.fruitName}} ${{ fruitColor }}
Extend from a template
To increase security, you can enforce that a pipeline extends from a particular template. The file start.yml defines the parameter buildSteps, which is then used in the pipeline azure-pipelines.yml. In start.yml, if a buildStep gets passed with a script step, then it's rejected and the pipeline build fails.
When extending from a template, you can increase security by adding a required template approval.
# File: start.yml
parameters:
- name: buildSteps # the name of the parameter is buildSteps
type: stepList # data type is StepList
default: [] # default value of buildSteps
stages:
- stage: secure_buildstage
pool:
vmImage: windows-latest
jobs:
- job: secure_buildjob
steps:
- script: echo This happens before code
displayName: 'Base: Pre-build'
- script: echo Building
displayName: 'Base: Build'
- ${{ each step in parameters.buildSteps }}:
- ${{ each pair in step }}:
${{ if ne(pair.value, 'CmdLine@2') }}:
${{ pair.key }}: ${{ pair.value }}
${{ if eq(pair.value, 'CmdLine@2') }}:
# Step is rejected by raising a YAML syntax error: Unexpected value 'CmdLine@2'
'${{ pair.value }}': error
- script: echo This happens after code
displayName: 'Base: Signing'
# File: azure-pipelines.yml
trigger:
- main
extends:
template: start.yml
parameters:
buildSteps:
- bash: echo Test #Passes
displayName: succeed
- bash: echo "Test"
displayName: succeed
# Step is rejected by raising a YAML syntax error: Unexpected value 'CmdLine@2'
- task: CmdLine@2
inputs:
script: echo "Script Test"
# Step is rejected by raising a YAML syntax error: Unexpected value 'CmdLine@2'
- script: echo "Script Test"
Extend from a template with resources
You can also use extends to extend from a template in your Azure pipeline that contains resources.
# File: azure-pipelines.yml
trigger:
- none
extends:
template: resource-template.yml
# File: resource-template.yml
resources:
pipelines:
- pipeline: my-pipeline
source: sourcePipeline
steps:
- script: echo "Testing resource template"
Use templateContext to pass properties to templates
You can use templateContext to pass additional properties to stages, steps, and jobs that are used as parameters in a template. Specifically, you can specify templateContext within the jobList, deploymentList, or stageList parameter data types.
You can use templateContext to make it easier to set up environments when processing each job. By bundling a job and its environment properties object together, templateContext helps you write YAML that's more maintainable and easier to understand.
In this example, the parameter testSet in testing-template.yml has the data type jobList. The template testing-template.yml creates a new variable testJob using the each keyword. The template then references testJob.templateContext.expectedHTTPResponseCode, which gets set in azure-pipeline.yml and passed to the template.
When the response code is 200, the template makes a REST request. When the response code is 500, the template outputs all of the environment variables for debugging.
templateContext can contain any properties you choose to pass.
#testing-template.yml
parameters:
- name: testSet
type: jobList
jobs:
- ${{ each testJob in parameters.testSet }}:
- ${{ if eq(testJob.templateContext.expectedHTTPResponseCode, 200) }}:
- job:
steps:
- powershell: 'Invoke-RestMethod -Uri https://blogs.msdn.microsoft.com/powershell/feed/ | Format-Table -Property Title, pubDate'
- ${{ testJob.steps }}
- ${{ if eq(testJob.templateContext.expectedHTTPResponseCode, 500) }}:
- job:
steps:
- powershell: 'Get-ChildItem -Path Env:\'
- ${{ testJob.steps }}
#azure-pipeline.yml
trigger: none
pool:
vmImage: ubuntu-latest
extends:
template: testing-template.yml
parameters:
testSet:
- job: positive_test
templateContext:
expectedHTTPResponseCode: 200
steps:
- script: echo "Run positive test"
- job: negative_test
templateContext:
expectedHTTPResponseCode: 500
steps:
- script: echo "Run negative test"
Insert a template
You can copy content from one YAML file and reuse it in a different one. Copying content from one YAML to another saves you from having to manually include the same logic in multiple places. The include-npm-steps.yml template contains steps that are reused in azure-pipelines.yml.
Note
Template files need to exist on your filesystem at the start of a pipeline run. You can't reference templates in an artifact.
# File: templates/include-npm-steps.yml
steps:
- script: npm install
- script: yarn install
- script: npm run compile
# File: azure-pipelines.yml
jobs:
- job: Linux
pool:
vmImage: 'ubuntu-latest'
steps:
- template: templates/include-npm-steps.yml # Template reference
- job: Windows
pool:
vmImage: 'windows-latest'
steps:
- template: templates/include-npm-steps.yml # Template reference
Step reuse
You can insert a template to reuse one or more steps across several jobs. In addition to the steps from the template, each job can define more steps.
# File: templates/npm-steps.yml
steps:
- script: npm install
- script: npm test
# File: azure-pipelines.yml
jobs:
- job: Linux
pool:
vmImage: 'ubuntu-latest'
steps:
- template: templates/npm-steps.yml # Template reference
- job: macOS
pool:
vmImage: 'macOS-latest'
steps:
- template: templates/npm-steps.yml # Template reference
- job: Windows
pool:
vmImage: 'windows-latest'
steps:
- script: echo This script runs before the template's steps, only on Windows.
- template: templates/npm-steps.yml # Template reference
- script: echo This step runs after the template's steps.
Job reuse
Much like steps, jobs can be reused with templates.
# File: templates/jobs.yml
jobs:
- job: Ubuntu
pool:
vmImage: 'ubuntu-latest'
steps:
- bash: echo "Hello Ubuntu"
- job: Windows
pool:
vmImage: 'windows-latest'
steps:
- bash: echo "Hello Windows"
# File: azure-pipelines.yml
jobs:
- template: templates/jobs.yml # Template reference
When working with multiple jobs, remember to remove the name of the job in the template file to avoid naming conflicts.
# File: templates/jobs.yml
jobs:
- job:
pool:
vmImage: 'ubuntu-latest'
steps:
- bash: echo "Hello Ubuntu"
- job:
pool:
vmImage: 'windows-latest'
steps:
- bash: echo "Hello Windows"
# File: azure-pipelines.yml
jobs:
- template: templates/jobs.yml # Template reference
- template: templates/jobs.yml # Template reference
- template: templates/jobs.yml # Template reference
Stage reuse
Stages can also be reused with templates.
# File: templates/stages1.yml
stages:
- stage: Angular
jobs:
- job: angularinstall
steps:
- script: npm install angular
# File: templates/stages2.yml
stages:
- stage: Build
jobs:
- job: build
steps:
- script: npm run build
# File: azure-pipelines.yml
trigger:
- main
pool:
vmImage: 'ubuntu-latest'
stages:
- stage: Install
jobs:
- job: npminstall
steps:
- task: Npm@1
inputs:
command: 'install'
- template: templates/stages1.yml
- template: templates/stages2.yml
Job, stage, and step templates with parameters
# File: templates/npm-with-params.yml
parameters:
- name: name # defaults for any parameters that aren't specified
default: ''
- name: vmImage
default: ''
jobs:
- job: ${{ parameters.name }}
pool:
vmImage: ${{ parameters.vmImage }}
steps:
- script: npm install
- script: npm test
When you consume the template in your pipeline, specify values for the template parameters.
# File: azure-pipelines.yml
jobs:
- template: templates/npm-with-params.yml # Template reference
parameters:
name: Linux
vmImage: 'ubuntu-latest'
- template: templates/npm-with-params.yml # Template reference
parameters:
name: macOS
vmImage: 'macOS-latest'
- template: templates/npm-with-params.yml # Template reference
parameters:
name: Windows
vmImage: 'windows-latest'
You can also use parameters with step or stage templates. For example, steps with parameters:
# File: templates/steps-with-params.yml
parameters:
- name: 'runExtendedTests' # defaults for any parameters that aren't specified
type: boolean
default: false
steps:
- script: npm test
- ${{ if eq(parameters.runExtendedTests, true) }}:
- script: npm test --extended
When you consume the template in your pipeline, specify values for the template parameters.
# File: azure-pipelines.yml
steps:
- script: npm install
- template: templates/steps-with-params.yml # Template reference
parameters:
runExtendedTests: 'true'
Note
Scalar parameters without a specified type are treated as strings.
For example, eq(true, parameters['myparam']) will return true, even if the myparam parameter is the word false, if myparam is not explicitly made boolean. Non-empty strings are cast to true in a Boolean context. That expression could be rewritten to explicitly compare strings: eq(parameters['myparam'], 'true').
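If you instead declare the parameter with an explicit boolean type, the comparison behaves as expected; a minimal sketch (the parameter name myparam is illustrative):
parameters:
- name: myparam
  type: boolean
  default: false

steps:
- ${{ if eq(parameters.myparam, true) }}: # typed value, so no string coercion surprises
  - script: echo myparam is true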
Parameters aren't limited to scalar strings. See the list of data types. For example, using the object type:
# azure-pipelines.yml
jobs:
- template: process.yml
parameters:
pool: # this parameter is called `pool`
vmImage: ubuntu-latest # and it's a mapping rather than a string
# process.yml
parameters:
- name: 'pool'
type: object
default: {}
jobs:
- job: build
pool: ${{ parameters.pool }}
Variable reuse
Variables can be defined in one YAML file and included in another template. This can be useful if you want to store all of your variables in one file. If you're using a template to include variables in a pipeline, the included template can only be used to define variables. You can use steps and more complex logic when you're extending from a template. Use parameters instead of variables when you want to restrict type.
In this example, the variable favoriteVeggie is included in azure-pipelines.yml.
# File: vars.yml
variables:
favoriteVeggie: 'brussels sprouts'
# File: azure-pipelines.yml
variables:
- template: vars.yml # Template reference
steps:
- script: echo My favorite vegetable is ${{ variables.favoriteVeggie }}.
Variable templates with parameters
You can pass parameters to variables with templates. In this example, you're passing the DIRECTORY parameter to a RELEASE_COMMAND variable.
# File: templates/package-release-with-params.yml
parameters:
- name: DIRECTORY
type: string
default: "." # defaults for any parameters that specified with "." (current directory)
variables:
- name: RELEASE_COMMAND
value: grep version ${{ parameters.DIRECTORY }}/package.json | awk -F \" '{print $4}'
When you consume the template in your pipeline, specify values for the template parameters.
# File: azure-pipelines.yml
variables: # Global variables
- template: package-release-with-params.yml # Template reference
parameters:
DIRECTORY: "azure/checker"
pool:
vmImage: 'ubuntu-latest'
stages:
- stage: Release_Stage
displayName: Release Version
variables: # Stage variables
- template: package-release-with-params.yml # Template reference
parameters:
DIRECTORY: "azure/todo-list"
jobs:
- job: A
steps:
- bash: $(RELEASE_COMMAND) #output release command
Reference template paths
Template paths should be relative to the file that does the including. Here's an example nested hierarchy.
|
+-- fileA.yml
|
+-- dir1/
|    |
|    +-- fileB.yml
|    |
|    +-- dir2/
|         |
|         +-- fileC.yml
Then, in fileA.yml you can reference fileB.yml and fileC.yml like this.
steps:
- template: dir1/fileB.yml
- template: dir1/dir2/fileC.yml
If fileC.yml is your starting point, you can include fileA.yml and fileB.yml like this.
steps:
- template: ../../fileA.yml
- template: ../fileB.yml
When fileB.yml is your starting point, you can include fileA.yml and fileC.yml like this.
steps:
- template: ../fileA.yml
- template: dir2/fileC.yml
Use other repositories
You can keep your templates in other repositories. For example, suppose you have a core pipeline that you want all of your app pipelines to use. You can put the template in a core repo and then refer to it from each of your app repos:
# Repo: Contoso/BuildTemplates
# File: common.yml
parameters:
- name: 'vmImage'
default: 'ubuntu 16.04'
type: string
jobs:
- job: Build
pool:
vmImage: ${{ parameters.vmImage }}
steps:
- script: npm install
- script: npm test
Now you can reuse this template in multiple pipelines.
Use the resources specification to provide the location of the core repo. When you refer to the core repo, use @ and the name you gave it in resources.
# Repo: Contoso/LinuxProduct
# File: azure-pipelines.yml
resources:
repositories:
- repository: templates
type: github
name: Contoso/BuildTemplates
jobs:
- template: common.yml@templates # Template reference
# Repo: Contoso/WindowsProduct
# File: azure-pipelines.yml
resources:
repositories:
- repository: templates
type: github
name: Contoso/BuildTemplates
ref: refs/tags/v1.0 # optional ref to pin to
jobs:
- template: common.yml@templates # Template reference
parameters:
vmImage: 'windows-latest'
For type: github, name is <identity>/<repo> as in the examples above. For type: git (Azure Repos), name is <project>/<repo>. If that project is in a separate Azure DevOps organization, you'll need to configure a service connection of type Azure Repos/Team Foundation Server with access to the project and include that in YAML:
resources:
repositories:
- repository: templates
name: Contoso/BuildTemplates
endpoint: myServiceConnection # Azure DevOps service connection
jobs:
- template: common.yml@templates
Repositories are resolved only once, when the pipeline starts up. After that, the same resource is used for the duration of the pipeline. Only the template files are used. Once the templates are fully expanded, the final pipeline runs as if it were defined entirely in the source repo. This means that you can't use scripts from the template repo in your pipeline.
If you want to use a particular, fixed version of the template, be sure to pin to a ref. The refs are either branches (refs/heads/<name>) or tags (refs/tags/<name>). If you want to pin a specific commit, first create a tag pointing to that commit, then pin to that tag.
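For instance, assuming you've tagged the desired commit v1.2.3 in the template repository (the tag name here is illustrative), you can pin to it like this:
resources:
  repositories:
  - repository: templates
    type: github
    name: Contoso/BuildTemplates
    ref: refs/tags/v1.2.3 # resolves to the commit that the tag points to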
Note
If no ref is specified, the pipeline defaults to using refs/heads/main.
You may also use @self to refer to the repository where the original pipeline was found. This is convenient for use in extends templates if you want to refer back to contents in the extending pipeline's repository. For example:
# Repo: Contoso/Central
# File: template.yml
jobs:
- job: PreBuild
steps: []
# Template reference to the repo where this template was
# included from - consumers of the template are expected
# to provide a "BuildJobs.yml"
- template: BuildJobs.yml@self
- job: PostBuild
steps: []
# Repo: Contoso/MyProduct
# File: azure-pipelines.yml
resources:
repositories:
- repository: templates
type: git
name: Contoso/Central
extends:
template: template.yml@templates
# Repo: Contoso/MyProduct
# File: BuildJobs.yml
jobs:
- job: Build
steps: []
Template expressions
Use template expressions to specify how values are dynamically resolved during pipeline initialization.
Wrap your template expression inside this syntax: ${{ }}. Template expressions can expand template parameters and also variables. You can use parameters to influence how a template is expanded. The parameters object works like the variables object in an expression. Only predefined variables can be used in template expressions.
Note
Expressions are only expanded for stages, jobs, steps, and containers (inside resources). You cannot, for example, use an expression inside trigger or a resource like repositories. Additionally, on Azure DevOps 2020 RTW, you can't use template expressions inside containers.
For example, you define a template:
# File: steps/msbuild.yml
parameters:
- name: 'solution'
default: '**/*.sln'
type: string
steps:
- task: msbuild@1
inputs:
solution: ${{ parameters['solution'] }} # index syntax
- task: vstest@2
inputs:
solution: ${{ parameters.solution }} # property dereference syntax
Then you reference the template and pass it the optional solution parameter:
# File: azure-pipelines.yml
steps:
- template: steps/msbuild.yml
parameters:
solution: my.sln
Context
Within a template expression, you have access to the parameters context that contains the values of parameters passed in. Additionally, you have access to the variables context that contains all the variables specified in the YAML file plus many of the predefined variables (noted on each variable in that topic). Importantly, it doesn't have runtime variables such as those stored on the pipeline or given when you start a run. Template expansion happens early in the run, so those variables aren't available.
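As a minimal sketch of that timing (myRuntimeVar is a hypothetical variable set during the run), a template expression expands to an empty string while macro syntax picks up the runtime value:
steps:
- script: echo "##vso[task.setvariable variable=myRuntimeVar]hello" # set during the run
- script: echo "Template expression ${{ variables.myRuntimeVar }}" # empty; expansion already happened
- script: echo "Macro syntax $(myRuntimeVar)" # prints hello; resolved at runtime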
Required parameters
You can add a validation step at the beginning of your template to check for the parameters you require.
Here's an example that checks for the solution parameter using Bash (which enables it to work on any platform):
# File: steps/msbuild.yml
parameters:
- name: 'solution'
default: ''
type: string
steps:
- bash: |
if [ -z "$SOLUTION" ]; then
echo "##vso[task.logissue type=error;]Missing template parameter \"solution\""
echo "##vso[task.complete result=Failed;]"
fi
env:
SOLUTION: ${{ parameters.solution }}
displayName: Check for required parameters
- task: msbuild@1
inputs:
solution: ${{ parameters.solution }}
- task: vstest@2
inputs:
solution: ${{ parameters.solution }}
To show that the template fails if it's missing the required parameter:
# File: azure-pipelines.yml
# This will fail since it doesn't set the "solution" parameter to anything,
# so the template will use its default of an empty string
steps:
- template: steps/msbuild.yml
Template expression functions
You can use general functions in your templates. You can also use a few template expression functions.
format
- Simple string token replacement
- Min parameters: 2. Max parameters: N
- Example: ${{ format('{0} Build', parameters.os) }} → 'Windows Build'
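For instance, format is handy for composing display names from parameters; a minimal sketch (the os parameter is illustrative):
parameters:
- name: os
  type: string
  default: Windows

jobs:
- job: build
  displayName: ${{ format('{0} Build', parameters.os) }} # expands to 'Windows Build'
  steps:
  - script: echo Building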
coalesce
- Evaluates to the first non-empty, non-null string argument
- Min parameters: 2. Max parameters: N
- Example:
parameters:
- name: 'restoreProjects'
default: ''
type: string
- name: 'buildProjects'
default: ''
type: string
steps:
- script: echo ${{ coalesce(parameters.restoreProjects, parameters.buildProjects, 'Nothing to see') }}
Insertion
You can use template expressions to alter the structure of a YAML pipeline. For instance, to insert into a sequence:
# File: jobs/build.yml
parameters:
- name: 'preBuild'
type: stepList
default: []
- name: 'preTest'
type: stepList
default: []
- name: 'preSign'
type: stepList
default: []
jobs:
- job: Build
pool:
vmImage: 'windows-latest'
steps:
- script: cred-scan
- ${{ parameters.preBuild }}
- task: msbuild@1
- ${{ parameters.preTest }}
- task: vstest@2
- ${{ parameters.preSign }}
- script: sign
# File: .vsts.ci.yml
jobs:
- template: jobs/build.yml
parameters:
preBuild:
- script: echo hello from pre-build
preTest:
- script: echo hello from pre-test
When an array is inserted into an array, the nested array is flattened.
To insert into a mapping, use the special property ${{ insert }}.
# Default values
parameters:
- name: 'additionalVariables'
type: object
default: {}
jobs:
- job: build
variables:
configuration: debug
arch: x86
${{ insert }}: ${{ parameters.additionalVariables }}
steps:
- task: msbuild@1
- task: vstest@2
jobs:
- template: jobs/build.yml
parameters:
additionalVariables:
TEST_SUITE: L0,L1
Conditional insertion
If you want to conditionally insert into a sequence or a mapping in a template, use insertions and expression evaluation. You can also use if statements outside of templates as long as you use template syntax.
For example, to insert into a sequence in a template:
# File: steps/build.yml
parameters:
- name: 'toolset'
default: msbuild
type: string
values:
- msbuild
- dotnet
steps:
# msbuild
- ${{ if eq(parameters.toolset, 'msbuild') }}:
- task: msbuild@1
- task: vstest@2
# dotnet
- ${{ if eq(parameters.toolset, 'dotnet') }}:
- task: dotnet@1
inputs:
command: build
- task: dotnet@1
inputs:
command: test
# File: azure-pipelines.yml
steps:
- template: steps/build.yml
parameters:
toolset: dotnet
For example, to insert into a mapping in a template:
# File: steps/build.yml
parameters:
- name: 'debug'
type: boolean
default: false
steps:
- script: tool
env:
${{ if eq(parameters.debug, true) }}:
TOOL_DEBUG: true
TOOL_DEBUG_DIR: _dbg
steps:
- template: steps/build.yml
parameters:
debug: true
You can also use conditional insertion for variables. In this example, start always prints, and this is a test only prints when the foo variable equals test.
variables:
- name: foo
value: test
pool:
vmImage: 'ubuntu-latest'
steps:
- script: echo "start" # always runs
- ${{ if eq(variables.foo, 'test') }}:
- script: echo "this is a test" # runs when foo=test
Iterative insertion
The each directive allows iterative insertion based on a YAML sequence (array) or mapping (key-value pairs).
For example, you can wrap the steps of each job with other pre- and post-steps:
# job.yml
parameters:
- name: 'jobs'
type: jobList
default: []
jobs:
- ${{ each job in parameters.jobs }}: # Each job
- ${{ each pair in job }}: # Insert all properties other than "steps"
${{ if ne(pair.key, 'steps') }}:
${{ pair.key }}: ${{ pair.value }}
steps: # Wrap the steps
- task: SetupMyBuildTools@1 # Pre steps
- ${{ job.steps }} # Users steps
- task: PublishMyTelemetry@1 # Post steps
condition: always()
# azure-pipelines.yml
jobs:
- template: job.yml
parameters:
jobs:
- job: A
steps:
- script: echo This will get sandwiched between SetupMyBuildTools and PublishMyTelemetry.
- job: B
steps:
- script: echo So will this!
You can also manipulate the properties of whatever you're iterating over. For example, to add more dependencies:
# job.yml
parameters:
- name: 'jobs'
type: jobList
default: []
jobs:
- job: SomeSpecialTool # Run your special tool in its own job first
steps:
- task: RunSpecialTool@1
- ${{ each job in parameters.jobs }}: # Then do each job
- ${{ each pair in job }}: # Insert all properties other than "dependsOn"
${{ if ne(pair.key, 'dependsOn') }}:
${{ pair.key }}: ${{ pair.value }}
dependsOn: # Inject dependency
- SomeSpecialTool
- ${{ if job.dependsOn }}:
- ${{ job.dependsOn }}
# azure-pipelines.yml
jobs:
- template: job.yml
parameters:
jobs:
- job: A
steps:
- script: echo This job depends on SomeSpecialTool, even though it's not explicitly shown here.
- job: B
dependsOn:
- A
steps:
- script: echo This job depends on both Job A and on SomeSpecialTool.
Escape a value
If you need to escape a value that literally contains ${{, then wrap the value in an expression string. For example, ${{ 'my${{value' }} or ${{ 'my${{value with a '' single quote too' }}.
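For example, the step below (illustrative only) uses an expression string so its display name keeps the literal ${{ text instead of being treated as an expression:
steps:
- script: echo Demonstrating escaping
  displayName: ${{ 'Literal ${{ in a display name' }} # renders as: Literal ${{ in a display name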
Imposed limits
Templates and template expressions can cause explosive growth to the size and complexity of a pipeline. To help prevent runaway growth, Azure Pipelines imposes the following limits:
- No more than 100 separate YAML files may be included (directly or indirectly)
- No more than 20 levels of template nesting (templates including other templates)
- No more than 10 megabytes of memory consumed while parsing the YAML (in practice, this is typically between 600 KB - 2 MB of on-disk YAML, depending on the specific features used)
Template parameters
You can pass parameters to templates.
The parameters section defines what parameters are available in the template and their default values. Templates are expanded just before the pipeline runs so that values surrounded by ${{ }} are replaced by the parameters it receives from the enclosing pipeline. As a result, only predefined variables can be used in parameters.
To use parameters across multiple pipelines, see how to create a variable group.
Job, stage, and step templates with parameters
# File: templates/npm-with-params.yml
parameters:
name: '' # defaults for any parameters that aren't specified
vmImage: ''
jobs:
- job: ${{ parameters.name }}
pool:
vmImage: ${{ parameters.vmImage }}
steps:
- script: npm install
- script: npm test
When you consume the template in your pipeline, specify values for the template parameters.
# File: azure-pipelines.yml
jobs:
- template: templates/npm-with-params.yml # Template reference
parameters:
name: Linux
vmImage: 'ubuntu-latest'
- template: templates/npm-with-params.yml # Template reference
parameters:
name: macOS
vmImage: 'macOS-10.13'
- template: templates/npm-with-params.yml # Template reference
parameters:
name: Windows
vmImage: 'windows-latest'
You can also use parameters with step or stage templates. For example, steps with parameters:
# File: templates/steps-with-params.yml
parameters:
runExtendedTests: 'false' # defaults for any parameters that aren't specified
steps:
- script: npm test
- ${{ if eq(parameters.runExtendedTests, 'true') }}:
- script: npm test --extended
When you consume the template in your pipeline, specify values for the template parameters.
# File: azure-pipelines.yml
steps:
- script: npm install
- template: templates/steps-with-params.yml # Template reference
parameters:
runExtendedTests: 'true'
Note
Scalar parameters are always treated as strings.
For example, eq(parameters['myparam'], true) will almost always return true, even if the myparam parameter is the word false. Non-empty strings are cast to true in a Boolean context. That expression could be rewritten to explicitly compare strings: eq(parameters['myparam'], 'true').
Parameters aren't limited to scalar strings. As long as the place where the parameter expands expects a mapping, the parameter can be a mapping. Likewise, sequences can be passed where sequences are expected. For example:
# azure-pipelines.yml
jobs:
- template: process.yml
parameters:
pool: # this parameter is called `pool`
vmImage: ubuntu-latest # and it's a mapping rather than a string
# process.yml
parameters:
pool: {}
jobs:
- job: build
pool: ${{ parameters.pool }}
Using other repositories
You can keep your templates in other repositories. For example, suppose you have a core pipeline that you want all of your app pipelines to use. You can put the template in a core repo and then refer to it from each of your app repos:
# Repo: Contoso/BuildTemplates
# File: common.yml
parameters:
vmImage: 'ubuntu 16.04'
jobs:
- job: Build
pool:
vmImage: ${{ parameters.vmImage }}
steps:
- script: npm install
- script: npm test
Now you can reuse this template in multiple pipelines.
Use the resources specification to provide the location of the core repo. When you refer to the core repo, use @ and the name you gave it in resources.
# Repo: Contoso/LinuxProduct
# File: azure-pipelines.yml
resources:
repositories:
- repository: templates
type: github
name: Contoso/BuildTemplates
jobs:
- template: common.yml@templates # Template reference
# Repo: Contoso/WindowsProduct
# File: azure-pipelines.yml
resources:
repositories:
- repository: templates
type: github
name: Contoso/BuildTemplates
ref: refs/tags/v1.0 # optional ref to pin to
jobs:
- template: common.yml@templates # Template reference
parameters:
vmImage: 'windows-latest'
For type: github, name is <identity>/<repo> as in the examples above. For type: git (Azure Repos), name is <project>/<repo>. The project must be in the same organization; cross-organization references aren't supported.
Repositories are resolved only once, when the pipeline starts up. After that, the same resource is used during the pipeline. Only the template files are used. Once the templates are fully expanded, the final pipeline runs as if it were defined entirely in the source repo. This means that you can't use scripts from the template repo in your pipeline.
If you want to use a particular, fixed version of the template, be sure to pin to a ref. Refs are either branches (refs/heads/<name>) or tags (refs/tags/<name>). If you want to pin a specific commit, first create a tag pointing to that commit, then pin to that tag.
Expressions
Use template expressions to specify how values are dynamically resolved during pipeline initialization.
Wrap your template expression inside this syntax: ${{ }}. Template expressions can expand template parameters and also variables. You can use parameters to influence how a template is expanded. The parameters object works like the variables object in an expression.
For example, you define a template:
# File: steps/msbuild.yml
parameters:
solution: '**/*.sln'
steps:
- task: msbuild@1
inputs:
solution: ${{ parameters['solution'] }} # index syntax
- task: vstest@2
inputs:
solution: ${{ parameters.solution }} # property dereference syntax
Then you reference the template and pass it the optional solution parameter:
# File: azure-pipelines.yml
steps:
- template: steps/msbuild.yml
parameters:
solution: my.sln
Context
Within a template expression, you have access to the parameters context, which contains the values of parameters passed in. Additionally, you have access to the variables context, which contains all the variables specified in the YAML file plus the system variables. Importantly, it doesn't have runtime variables such as those stored on the pipeline or given when you start a run. Template expansion happens early in the run, so those variables aren't available.
Required parameters
You can add a validation step at the beginning of your template to check for the parameters you require.
Here's an example that checks for the solution parameter using Bash (which enables it to work on any platform):
# File: steps/msbuild.yml
parameters:
solution: ''
steps:
- bash: |
if [ -z "$SOLUTION" ]; then
echo "##vso[task.logissue type=error;]Missing template parameter \"solution\""
echo "##vso[task.complete result=Failed;]"
fi
env:
SOLUTION: ${{ parameters.solution }}
displayName: Check for required parameters
- task: msbuild@1
inputs:
solution: ${{ parameters.solution }}
- task: vstest@2
inputs:
solution: ${{ parameters.solution }}
To show that the template fails if it's missing the required parameter:
# File: azure-pipelines.yml
# This will fail since it doesn't set the "solution" parameter to anything,
# so the template will use its default of an empty string
steps:
- template: steps/msbuild.yml
Template expression functions
You can use general functions in your templates. You can also use a few template expression functions.
format
- Simple string token replacement
- Min parameters: 2. Max parameters: N
- Example: ${{ format('{0} Build', parameters.os) }} → 'Windows Build'
coalesce
- Evaluates to the first nonempty, non-null string argument
- Min parameters: 2. Max parameters: N
- Example:
parameters:
restoreProjects: ''
buildProjects: ''
steps:
- script: echo ${{ coalesce(parameters.restoreProjects, parameters.buildProjects, 'Nothing to see') }}
Insertion
You can use template expressions to alter the structure of a YAML pipeline. For instance, to insert into a sequence:
# File: jobs/build.yml
parameters:
preBuild: []
preTest: []
preSign: []
jobs:
- job: Build
pool:
vmImage: 'windows-latest'
steps:
- script: cred-scan
- ${{ parameters.preBuild }}
- task: msbuild@1
- ${{ parameters.preTest }}
- task: vstest@2
- ${{ parameters.preSign }}
- script: sign
# File: .vsts.ci.yml
jobs:
- template: jobs/build.yml
parameters:
preBuild:
- script: echo hello from pre-build
preTest:
- script: echo hello from pre-test
When an array is inserted into an array, the nested array is flattened.
To insert into a mapping, use the special property ${{ insert }}.
# Default values
parameters:
additionalVariables: {}
jobs:
- job: build
variables:
configuration: debug
arch: x86
${{ insert }}: ${{ parameters.additionalVariables }}
steps:
- task: msbuild@1
- task: vstest@2
jobs:
- template: jobs/build.yml
parameters:
additionalVariables:
TEST_SUITE: L0,L1
Conditional insertion
If you want to conditionally insert into a sequence or a mapping, then use insertions and expression evaluation.
For example, to insert into a sequence:
# File: steps/build.yml
parameters:
toolset: msbuild
steps:
# msbuild
- ${{ if eq(parameters.toolset, 'msbuild') }}:
- task: msbuild@1
- task: vstest@2
# dotnet
- ${{ if eq(parameters.toolset, 'dotnet') }}:
- task: dotnet@1
inputs:
command: build
- task: dotnet@1
inputs:
command: test
# File: azure-pipelines.yml
steps:
- template: steps/build.yml
parameters:
toolset: dotnet
For example, to insert into a mapping:
# File: steps/build.yml
parameters:
debug: false
steps:
- script: tool
env:
${{ if eq(parameters.debug, 'true') }}:
TOOL_DEBUG: true
TOOL_DEBUG_DIR: _dbg
steps:
- template: steps/build.yml
parameters:
debug: true
Iterative insertion
The each directive allows iterative insertion based on a YAML sequence (array) or mapping (key-value pairs).
For example, you can wrap the steps of each job with extra pre- and post-steps:
# job.yml
parameters:
jobs: []
jobs:
- ${{ each job in parameters.jobs }}: # Each job
- ${{ each pair in job }}: # Insert all properties other than "steps"
${{ if ne(pair.key, 'steps') }}:
${{ pair.key }}: ${{ pair.value }}
steps: # Wrap the steps
- task: SetupMyBuildTools@1 # Pre steps
- ${{ job.steps }} # Users steps
- task: PublishMyTelemetry@1 # Post steps
condition: always()
# azure-pipelines.yml
jobs:
- template: job.yml
parameters:
jobs:
- job: A
steps:
- script: echo This will get sandwiched between SetupMyBuildTools and PublishMyTelemetry.
- job: B
steps:
- script: echo So will this!
You can also manipulate the properties of whatever you're iterating over. For example, to add more dependencies:
# job.yml
parameters:
jobs: []
jobs:
- job: SomeSpecialTool # Run your special tool in its own job first
steps:
- task: RunSpecialTool@1
- ${{ each job in parameters.jobs }}: # Then do each job
- ${{ each pair in job }}: # Insert all properties other than "dependsOn"
${{ if ne(pair.key, 'dependsOn') }}:
${{ pair.key }}: ${{ pair.value }}
dependsOn: # Inject dependency
- SomeSpecialTool
- ${{ if job.dependsOn }}:
- ${{ job.dependsOn }}
# azure-pipelines.yml
jobs:
- template: job.yml
parameters:
jobs:
- job: A
steps:
- script: echo This job depends on SomeSpecialTool, even though it's not explicitly shown here.
- job: B
dependsOn:
- A
steps:
- script: echo This job depends on both Job A and on SomeSpecialTool.
Escaping
If you need to escape a value that literally contains ${{, then wrap the value in an expression string. For example, ${{ 'my${{value' }} or ${{ 'my${{value with a '' single quote too' }}.
Limits
Templates and template expressions can cause explosive growth to the size and complexity of a pipeline. To help prevent runaway growth, Azure Pipelines imposes the following limits:
- No more than 50 separate YAML files may be included (directly or indirectly)
- No more than 10 megabytes of memory consumed while parsing the YAML (in practice, this is typically between 600 KB - 2 MB of on-disk YAML, depending on the specific features used)
- No more than 2000 characters per template expression are allowed
FAQ
How can I use variables inside of templates?
There are times when it may be useful to set parameters to values based on variables. Parameters are expanded early in processing a pipeline run so not all variables will be available. To see what predefined variables are available in templates, see Use predefined variables.
In this example, the predefined variables Build.SourceBranch and Build.Reason are used in conditions in template.yml.
# File: azure-pipelines.yml
trigger:
- main
extends:
template: template.yml
# File: template.yml
steps:
- script: echo Build.SourceBranch = $(Build.SourceBranch) # outputs refs/heads/main
- script: echo Build.Reason = $(Build.Reason) # outputs IndividualCI
- ${{ if eq(variables['Build.SourceBranch'], 'refs/heads/main') }}:
- script: echo I run only if Build.SourceBranch = refs/heads/main
- ${{ if eq(variables['Build.Reason'], 'IndividualCI') }}:
- script: echo I run only if Build.Reason = IndividualCI
- script: echo I run after the conditions