Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019
Important
Select the version of this article that corresponds to your platform and version. The version selector is above the table of contents. Look up your Azure DevOps platform and version.
Expressions can be used in many places where you need to specify a string, boolean, or number value when authoring a pipeline. When an expression returns an array, normal indexing rules apply and the index starts with 0.
The most common use of expressions is in conditions to determine whether a job or step should run.
# Expressions are used to define conditions for a step, job, or stage
steps:
- task: ...
condition: <expression>
Another common use of expressions is in defining variables.
Expressions can be evaluated at compile time or at run time.
Compile time expressions can be used anywhere; runtime expressions can be used in variables and conditions. Runtime expressions are intended as a way to compute the contents of variables and state (example: condition).
# Two examples of expressions used to define variables
# The first one, a, is evaluated when the YAML file is compiled into a plan.
# The second one, b, is evaluated at runtime.
# Note the syntax ${{}} for compile time and $[] for runtime expressions.
variables:
a: ${{ <expression> }}
b: $[ <expression> ]
The difference between runtime and compile time expression syntaxes is primarily what context is available.
In a compile-time expression (${{ <expression> }}), you have access to parameters and statically defined variables.
In a runtime expression ($[ <expression> ]), you have access to more variables but no parameters.
In this example, a runtime expression sets the value of $(isMain). A static variable in a compile-time expression sets the value of $(compileVar).
variables:
staticVar: 'my value' # static variable
compileVar: ${{ variables.staticVar }} # compile time expression
isMain: $[eq(variables['Build.SourceBranch'], 'refs/heads/main')] # runtime expression
steps:
- script: |
echo ${{variables.staticVar}} # outputs my value
echo $(compileVar) # outputs my value
echo $(isMain) # outputs True
An expression can be a literal, a reference to a variable, a reference to a dependency, a function, or a valid nested combination of these.
As part of an expression, you can use boolean, null, number, string, or version literals.
# Examples
variables:
someBoolean: ${{ true }} # case insensitive, so True or TRUE also works
someNumber: ${{ -1.2 }}
someString: ${{ 'a b c' }}
someVersion: ${{ 1.2.3 }}
Boolean: True and False are boolean literal expressions.
Null: Null is a special literal expression that's returned from a dictionary miss, for example (variables['noSuch']). Null can be the output of an expression but can't be called directly within an expression.
Number: Starts with '-', '.', or '0' through '9'.
String: Must be single-quoted. For example: 'this is a string'. To express a literal single quote, escape it with a single quote. For example: 'It''s OK if they''re using contractions.'. You can use a pipe character (|) for multiline strings:
myKey: |
one
two
three
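As a minimal sketch, the escaped string above can also be assigned to a variable (the variable name quotedString is illustrative):

variables:
  quotedString: ${{ 'It''s OK if they''re using contractions.' }}

steps:
# Quote the argument so the shell prints the apostrophes as-is.
- script: echo "$(quotedString)"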
Version: A version number with up to four segments. Must start with a number and contain two or three period (.) characters. For example: 1.2.3.4.
As part of an expression, you may access variables using one of two syntaxes:
variables['MyVar']
variables.MyVar
To use property dereference syntax, the property name must:
Start with a-Z or _
Be followed by a-Z, 0-9, or _
Depending on the execution context, different variables are available.
Variables are always strings. If you want to use typed values, then you should use parameters instead.
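For example, a minimal sketch in which both syntaxes resolve to the same statically defined variable (the variable name myVar is illustrative):

variables:
  myVar: 'hello'

steps:
# Index syntax and property dereference syntax read the same variable.
- script: echo ${{ variables['myVar'] }} ${{ variables.myVar }} # outputs hello hello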
Note
There is a limitation on using variables with expressions for both Classic and YAML pipelines when setting up such variables via the variables tab in the UI. Variables that are defined as expressions shouldn't depend on another variable whose value is also an expression, because it isn't guaranteed that both expressions are evaluated in the right order. For example, suppose variable a has the value $[ <expression> ] and that value is used as part of the value of variable b. Because the order in which variables are processed isn't guaranteed, variable b could end up with an incorrect value of a after evaluation.
Such constructions are allowed only when you set up variables through the variables keyword in a YAML pipeline. You must place the variables in the order they should be processed to get the correct values after processing.
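For example, a minimal sketch of that ordering (the counter prefix and the variable names a and b are illustrative); because b depends on a, a is declared first:

variables:
  a: $[counter('illustrative-prefix', 1)]     # evaluated first
  b: $[format('run-number-{0}', variables.a)] # uses a, so it's declared after a

steps:
- script: echo $(b)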
The following built-in functions can be used in expressions.
and — Evaluates True if all parameters are True; otherwise False. Example: and(eq(variables.letters, 'ABC'), eq(variables.numbers, 123))
coalesce — Evaluates the parameters in order and returns the first value that doesn't equal null or empty string. Example: coalesce(variables.couldBeNull, variables.couldAlsoBeNull, 'literal so it always works')
contains — Evaluates True if the left parameter string contains the right parameter. Example: contains('ABCDE', 'BCD') (returns True)
containsValue — Evaluates True if the left parameter is an array and any item equals the right parameter. Also evaluates True if the left parameter is an object and the value of any property equals the right parameter. Evaluates False if the conversion fails.
Note
There is no literal syntax in a YAML pipeline for specifying an array. This function is of limited use in general pipelines. It's intended for use in the pipeline decorator context with system-provided arrays such as the list of steps.
You can use the containsValue
expression to find a matching value in an object. Here's an example that demonstrates looking in list of source branches for a match for Build.SourceBranch
.
parameters:
- name: branchOptions
displayName: Source branch options
type: object
default:
- refs/heads/main
- refs/heads/test
jobs:
- job: A1
steps:
- ${{ each value in parameters.branchOptions }}:
- script: echo ${{ value }}
- job: B1
condition: ${{ containsValue(parameters.branchOptions, variables['Build.SourceBranch']) }}
steps:
- script: echo "Matching branch found"
convertToJson — Converts a nested YAML object, such as an object parameter, to JSON. The following example outputs the listOfValues parameter as JSON:
parameters:
- name: listOfValues
type: object
default:
this_is:
a_complex: object
with:
- one
- two
steps:
- script: |
echo "${MY_JSON}"
env:
MY_JSON: ${{ convertToJson(parameters.listOfValues) }}
Script output:
{
"this_is": {
"a_complex": "object",
"with": [
"one",
"two"
]
}
}
counter — Takes two parameters, a prefix and a seed; the prefix should use UTF-16 characters. You can create a counter that is automatically incremented by one in each execution of your pipeline. When you define a counter, you provide the prefix and the seed. Here's an example that demonstrates this.
variables:
major: 1
# define minor as a counter with the prefix as variable major, and seed as 100.
minor: $[counter(variables['major'], 100)]
steps:
- bash: echo $(minor)
The value of minor
in the above example in the first run of the pipeline is 100. In the second run it is 101, provided the value of major
is still 1.
If you edit the YAML file, and update the value of the variable major
to be 2, then in the next run of the pipeline, the value of minor
will be 100. Subsequent runs increment the counter to 101, 102, 103, ...
Later, if you edit the YAML file, and set the value of major
back to 1, then the value of the counter resumes where it left off for that prefix. In this example, it resumes at 102.
Here's another example of setting a variable to act as a counter that starts at 100, gets incremented by 1 for every run, and gets reset to 100 every day.
Note
pipeline.startTime
is not available outside of expressions. pipeline.startTime
formats system.pipelineStartTime
into a date and time object so that it is available to work with expressions.
The default time zone for pipeline.startTime
is UTC. You can change the time zone for your organization.
jobs:
- job:
variables:
a: $[counter(format('{0:yyyyMMdd}', pipeline.startTime), 100)]
steps:
- bash: echo $(a)
Here's an example of having a counter that maintains a separate value for PRs and CI runs.
variables:
patch: $[counter(variables['build.reason'], 0)]
Counters are scoped to a pipeline. In other words, a counter's value is incremented for each run of that pipeline. There are no project-scoped counters.
endsWith — Evaluates True if the left parameter string ends with the right parameter. Example: endsWith('ABCDE', 'DE') (returns True)
eq — Evaluates True if the parameters are equal; returns False if the conversion fails. Example: eq(variables.letters, 'ABC')
format — Evaluates the trailing parameters and inserts them into the leading parameter string. Example: format('Hello {0} {1}', 'John', 'Doe'). Date formatting supports the specifiers yyyy, yy, MM, M, dd, d, HH, H, m, mm, ss, s, f, ff, ffff, and K. Example: format('{0:yyyyMMdd}', pipeline.startTime); in this case pipeline.startTime is a special date time object variable. To emit a literal brace, double it: format('literal left brace {{ and literal right brace }}')
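As a hedged sketch, format can also build a compile-time variable from literal parameters (the variable name greeting is illustrative):

variables:
  greeting: ${{ format('Hello {0} {1}', 'John', 'Doe') }}

steps:
- script: echo "$(greeting)" # outputs Hello John Doe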
ge — Evaluates True if the left parameter is greater than or equal to the right parameter. Example: ge(5, 5) (returns True)
gt — Evaluates True if the left parameter is greater than the right parameter. Example: gt(5, 2) (returns True)
in — Evaluates True if the left parameter is equal to any right parameter; returns False if the conversion fails. Example: in('B', 'A', 'B', 'C') (returns True)
join — Concatenates the elements of the right-parameter array, separated by the left-parameter string. In this example, a semicolon gets added between each item in the array. The parameter type is an object.
parameters:
- name: myArray
type: object
default:
- FOO
- BAR
- ZOO
variables:
A: ${{ join(';',parameters.myArray) }}
steps:
- script: echo $A # outputs FOO;BAR;ZOO
le — Evaluates True if the left parameter is less than or equal to the right parameter. Example: le(2, 2) (returns True)
length — Returns the length of a string or an array. Example: length('fabrikam') returns 8
lower — Converts a string to lowercase. Example: lower('FOO') returns foo
lt — Evaluates True if the left parameter is less than the right parameter. Example: lt(2, 5) (returns True)
ne — Evaluates True if the parameters are not equal; returns True if the conversion fails. Example: ne(1, 2) (returns True)
not — Evaluates True if the parameter is False. Example: not(eq(1, 2)) (returns True)
notIn — Evaluates True if the left parameter isn't equal to any right parameter; returns False if the conversion fails. Example: notIn('D', 'A', 'B', 'C') (returns True)
or — Evaluates True if any parameter is True. Example: or(eq(1, 1), eq(2, 3)) (returns True, short-circuits)
replace — replace(a, b, c) returns a, with all instances of b replaced by c. Example: replace('https://www.tinfoilsecurity.com/saml/consume','https://www.tinfoilsecurity.com','http://server') (returns http://server/saml/consume)
split — Splits a string into substrings based on the specified delimiting characters. In this example, split iterates over a comma-separated list of environments:
variables:
- name: environments
value: prod1,prod2
steps:
- ${{ each env in split(variables.environments, ',')}}:
- script: ./deploy.sh --environment ${{ env }}
The next example combines split with replace to extract the resource name from each resource ID, replace hyphens with underscores, and append the environment name.
parameters:
- name: resourceIds
type: object
default:
- /subscriptions/mysubscription/resourceGroups/myResourceGroup/providers/Microsoft.Network/loadBalancers/kubernetes-internal
- /subscriptions/mysubscription02/resourceGroups/myResourceGroup02/providers/Microsoft.Network/loadBalancers/kubernetes
- name: environments
type: object
default:
- prod1
- prod2
trigger:
- main
steps:
- ${{ each env in parameters.environments }}:
- ${{ each resourceId in parameters.resourceIds }}:
- script: echo ${{ replace(split(resourceId, '/')[8], '-', '_') }}_${{ env }}
startsWith — Evaluates True if the left parameter string starts with the right parameter. Example: startsWith('ABCDE', 'AB') (returns True)
upper — Converts a string to uppercase. Example: upper('bah') returns BAH
xor — Evaluates True if exactly one parameter is True. Example: xor(True, False) (returns True)
You can use the following status check functions as expressions in conditions, but not in variable definitions.
always — Evaluates True (even when canceled). Note: A critical failure may still prevent a task from running. For example, if getting sources failed.
canceled — Evaluates True if the pipeline was canceled.
failed — For a step, equivalent to eq(variables['Agent.JobStatus'], 'Failed'). For a job, evaluates True only if any previous job in the dependency graph failed. When you specify job names as arguments, evaluates True only if any of those jobs failed.
succeeded — For a step, equivalent to in(variables['Agent.JobStatus'], 'Succeeded', 'SucceededWithIssues'). Use with dependsOn when working with jobs and you want to evaluate whether a previous job was successful. Jobs are designed to run in parallel while stages run sequentially. For a job, evaluates True only if all previous jobs in the dependency graph succeeded or partially succeeded. When you specify job names as arguments, evaluates True if all of those jobs succeeded or partially succeeded. Evaluates False if the pipeline is canceled.
succeededOrFailed — For a step, equivalent to in(variables['Agent.JobStatus'], 'Succeeded', 'SucceededWithIssues', 'Failed'). For a job, with no arguments, evaluates True regardless of whether any jobs in the dependency graph succeeded or failed. When you specify job names as arguments, evaluates True whether any of those jobs succeeded or failed. You may want to use not(canceled()) instead when there are previous skipped jobs in the dependency graph. This is like always(), except it will evaluate False when the pipeline is canceled.
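For example, a minimal sketch of these functions in step conditions (the script contents are illustrative):

steps:
- script: echo "runs while the job status is Succeeded or SucceededWithIssues"
  condition: succeeded()
- script: echo "also runs after a failure, but not when the run is canceled"
  condition: succeededOrFailed()
- script: echo "runs even when the run is canceled"
  condition: always()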
You can use if, elseif, and else clauses to conditionally assign variable values or set inputs for tasks. You can also conditionally run a step when a condition is met.
The elseif and else clauses are available starting with Azure DevOps 2022 and aren't available for Azure DevOps Server 2020 and earlier versions of Azure DevOps.
Conditionals only work when using template syntax. Learn more about variable syntax.
For templates, you can use conditional insertion when adding a sequence or mapping. Learn more about conditional insertion in templates.
variables:
${{ if eq(variables['Build.SourceBranchName'], 'main') }}: # only works if you have a main branch
stageName: prod
pool:
vmImage: 'ubuntu-latest'
steps:
- script: echo ${{variables.stageName}}
The next example conditionally sets the artifact name in a task input:
pool:
vmImage: 'ubuntu-latest'
steps:
- task: PublishPipelineArtifact@1
inputs:
targetPath: '$(Pipeline.Workspace)'
${{ if eq(variables['Build.SourceBranchName'], 'main') }}:
artifact: 'prod'
${{ else }}:
artifact: 'dev'
publishLocation: 'pipeline'
If there's no variable set, or the value of foo
doesn't match the if
conditions, the else
statement runs. Here the value of foo
returns true in the elseif
condition.
variables:
- name: foo
value: contoso # triggers elseif condition
pool:
vmImage: 'ubuntu-latest'
steps:
- script: echo "start"
- ${{ if eq(variables.foo, 'adaptum') }}:
- script: echo "this is adaptum"
- ${{ elseif eq(variables.foo, 'contoso') }}: # true
- script: echo "this is contoso"
- ${{ else }}:
- script: echo "the value is not adaptum or contoso"
You can use the each
keyword to loop through parameters with the object type.
parameters:
- name: listOfStrings
type: object
default:
- one
- two
steps:
- ${{ each value in parameters.listOfStrings }}:
- script: echo ${{ value }}
Additionally, you can iterate through nested elements within an object.
parameters:
- name: listOfFruits
type: object
default:
- fruitName: 'apple'
colors: ['red','green']
- fruitName: 'lemon'
colors: ['yellow']
steps:
- ${{ each fruit in parameters.listOfFruits }} :
- ${{ each fruitColor in fruit.colors}} :
- script: echo ${{ fruit.fruitName}} ${{ fruitColor }}
Expressions can use the dependencies context to reference previous jobs or stages. You can use dependencies to check the result of a previous job or stage and to read the output variables it set.
The context is called dependencies
for jobs and stages and works much like variables.
If you refer to an output variable from a job in another stage, the context is called stageDependencies
.
If you experience issues with output variables having quote characters ('
or "
) in them, see this troubleshooting guide.
The syntax of referencing output variables with dependencies varies depending on the circumstances. Here's an overview of the most common scenarios. There might be times when alternate syntax also works.
Stage to stage dependency (different stages): Reference an output variable from a previous stage in a job in a different stage in a condition in stages.
Syntax: and(succeeded(), eq(stageDependencies.<stage-name>.outputs['<job-name>.<step-name>.<variable-name>'], 'true'))
Example: and(succeeded(), eq(stageDependencies.A.outputs['A1.printvar.shouldrun'], 'true'))
Job to job dependency (same stage): Reference an output variable from a different job in the same stage in stages.
Syntax: and(succeeded(), eq(dependencies.<job-name>.outputs['<step-name>.<variable-name>'], 'true'))
Example: and(succeeded(), eq(dependencies.A.outputs['printvar.shouldrun'], 'true'))
Job to stage dependency (different stages): Reference an output variable from a different stage in a job.
Syntax: eq(stageDependencies.<stage-name>.<job-name>.outputs['<step-name>.<variable-name>'], 'true')
Example: eq(stageDependencies.A.A1.outputs['printvar.shouldrun'], 'true')
Stage to stage dependency (deployment job): Reference an output variable from a deployment job in a different stage in stages.
Syntax: eq(dependencies.<stage-name>.outputs['<deployment-job-name>.<deployment-job-name>.<step-name>.<variable-name>'], 'true')
Example: eq(dependencies.build.outputs['build_job.build_job.setRunTests.runTests'], 'true')
Stage to stage dependency (deployment job with resource): Reference an output variable from a deployment job that includes a resource, in a different stage in stages.
Syntax: eq(dependencies.<stage-name>.outputs['<deployment-job-name>.<Deploy_resource-name>.<step-name>.<variable-name>'], 'true')
Example: eq(dependencies.build.outputs['build_job.Deploy_winVM.setRunTests.runTests'], 'true')
There are also different syntaxes for output variables in deployment jobs depending on the deployment strategy. For more information, see Deployment jobs.
Structurally, the dependencies
object is a map of job and stage names to results
and outputs
.
Expressed as JSON, it would look like:
"dependencies": {
"<STAGE_NAME>" : {
"result": "Succeeded|SucceededWithIssues|Skipped|Failed|Canceled",
"outputs": {
"jobName.stepName.variableName": "value"
}
},
"...": {
// another stage
}
}
Note
The following examples use standard pipeline syntax. If you're using deployment pipelines, both variable and conditional variable syntax will differ. For information about the specific syntax to use, see Deployment jobs.
Use this form of dependencies
to map in variables or check conditions at a stage level.
In this example, there are two stages, A and B. Stage A has the condition false
and won't ever run as a result. Stage B runs if the result of Stage A is Succeeded
, SucceededWithIssues
, or Skipped
. Stage B runs because Stage A was skipped.
stages:
- stage: A
condition: false
jobs:
- job: A1
steps:
- script: echo Job A1
- stage: B
condition: in(dependencies.A.result, 'Succeeded', 'SucceededWithIssues', 'Skipped')
jobs:
- job: B1
steps:
- script: echo Job B1
Stages can also use output variables from another stage.
In this example, there are also two stages. Stage A includes a job, A1, that sets an output variable shouldrun
to true
. Stage B runs when shouldrun
is true
. Because shouldrun
is true
, Stage B runs.
stages:
- stage: A
jobs:
- job: A1
steps:
- bash: echo "##vso[task.setvariable variable=shouldrun;isOutput=true]true"
# or on Windows:
# - script: echo ##vso[task.setvariable variable=shouldrun;isOutput=true]true
name: printvar
- stage: B
condition: and(succeeded(), eq(dependencies.A.outputs['A1.printvar.shouldrun'], 'true'))
dependsOn: A
jobs:
- job: B1
steps:
- script: echo hello from Stage B
Note
By default, each stage in a pipeline depends on the one just before it in the YAML file.
If you need to refer to a stage that isn't immediately prior to the current one, you can override this automatic default by adding a dependsOn
section to the stage.
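A minimal sketch of that override (the stage and job names are illustrative):

stages:
- stage: A
  jobs:
  - job: A1
    steps:
    - script: echo from A
- stage: B
  jobs:
  - job: B1
    steps:
    - script: echo from B
- stage: C
  dependsOn: A # C waits only for A instead of the immediately preceding stage B
  jobs:
  - job: C1
    steps:
    - script: echo from C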
At the job level within a single stage, the dependencies
data doesn't contain stage-level information.
"dependencies": {
"<JOB_NAME>": {
"result": "Succeeded|SucceededWithIssues|Skipped|Failed|Canceled",
"outputs": {
"stepName.variableName": "value1"
}
},
"...": {
// another job
}
}
In this example, there are three jobs (a, b, and c). Job a will always be skipped because of condition: false
.
Job b runs because there are no associated conditions.
Job c runs because all of its dependencies either succeed (job b) or are skipped (job a).
jobs:
- job: a
condition: false
steps:
- script: echo Job a
- job: b
steps:
- script: echo Job b
- job: c
dependsOn:
- a
- b
condition: |
and
(
in(dependencies.a.result, 'Succeeded', 'SucceededWithIssues', 'Skipped'),
in(dependencies.b.result, 'Succeeded', 'SucceededWithIssues', 'Skipped')
)
steps:
- script: echo Job c
In this example, Job B depends on an output variable from Job A.
jobs:
- job: A
steps:
- bash: echo "##vso[task.setvariable variable=shouldrun;isOutput=true]true"
# or on Windows:
# - script: echo ##vso[task.setvariable variable=shouldrun;isOutput=true]true
name: printvar
- job: B
condition: and(succeeded(), eq(dependencies.A.outputs['printvar.shouldrun'], 'true'))
dependsOn: A
steps:
- script: echo hello from B
At the job level, you can also reference outputs from a job in a previous stage.
This requires using the stageDependencies
context.
"stageDependencies": {
"<STAGE_NAME>" : {
"<JOB_NAME>": {
"result": "Succeeded|SucceededWithIssues|Skipped|Failed|Canceled",
"outputs": {
"stepName.variableName": "value"
}
},
"...": {
// another job
}
},
"...": {
// another stage
}
}
In this example, job B1 runs if job A1 is skipped. Job B2 checks the value of the output variable from job A1 to determine whether it should run.
stages:
- stage: A
jobs:
- job: A1
steps:
- bash: echo "##vso[task.setvariable variable=shouldrun;isOutput=true]true"
# or on Windows:
# - script: echo ##vso[task.setvariable variable=shouldrun;isOutput=true]true
name: printvar
- stage: B
dependsOn: A
jobs:
- job: B1
    condition: in(stageDependencies.A.A1.result, 'Skipped') # change the condition to `Succeeded` and this job will be skipped
steps:
- script: echo hello from Job B1
- job: B2
condition: eq(stageDependencies.A.A1.outputs['printvar.shouldrun'], 'true')
steps:
- script: echo hello from Job B2
If a job depends on a variable defined by a deployment job in a different stage, then the syntax is different. In the following example, the job run_tests
runs if the build_job
deployment job set runTests
to true
. Notice that the key used for the outputs
dictionary is build_job.setRunTests.runTests
.
stages:
- stage: build
jobs:
- deployment: build_job
environment:
name: Production
strategy:
runOnce:
deploy:
steps:
- task: PowerShell@2
name: setRunTests
inputs:
targetType: inline
pwsh: true
script: |
$runTests = "true"
echo "setting runTests: $runTests"
echo "##vso[task.setvariable variable=runTests;isOutput=true]$runTests"
- stage: test
dependsOn:
- 'build'
jobs:
- job: run_tests
condition: eq(stageDependencies.build.build_job.outputs['build_job.setRunTests.runTests'], 'true')
steps:
...
If a stage depends on a variable defined by a deployment job in a different stage, then the syntax is different. In the following example, the stage test
depends on the deployment build_job
setting shouldTest
to true
. Notice that in the condition
of the test
stage, build_job
appears twice.
stages:
- stage: build
jobs:
- deployment: build_job
environment:
name: Production
strategy:
runOnce:
deploy:
steps:
- task: PowerShell@2
name: setRunTests
inputs:
targetType: inline
pwsh: true
script: |
$runTests = "true"
echo "setting runTests: $runTests"
echo "##vso[task.setvariable variable=runTests;isOutput=true]$runTests"
- stage: test
dependsOn:
- 'build'
condition: eq(dependencies.build.outputs['build_job.build_job.setRunTests.runTests'], 'true')
jobs:
- job: A
steps:
- script: echo Hello from job A
In the example above, the condition references an environment and not an environment resource. To reference an environment resource, you need to add the environment resource name to the dependencies condition. In the following example, the condition references a virtual machine resource named winVM2 that belongs to the vmtest environment.
stages:
- stage: build
jobs:
- deployment: build_job
environment:
name: vmtest
resourceName: winVM2
resourceType: VirtualMachine
strategy:
runOnce:
deploy:
steps:
- task: PowerShell@2
name: setRunTests
inputs:
targetType: inline
pwsh: true
script: |
$runTests = "true"
echo "setting runTests: $runTests"
echo "##vso[task.setvariable variable=runTests;isOutput=true]$runTests"
- stage: test
dependsOn:
- 'build'
condition: eq(dependencies.build.outputs['build_job.Deploy_winVM2.setRunTests.runTests'], 'true')
jobs:
- job: A
steps:
- script: echo Hello from job A
When operating on a collection of items, you can use the * syntax to apply a filtered array. A filtered array returns all objects/elements regardless of their names.
As an example, consider an array of objects named foo
. We want to get an array of the values of the id
property in each object in our array.
[
{ "id": 1, "a": "avalue1"},
{ "id": 2, "a": "avalue2"},
{ "id": 3, "a": "avalue3"}
]
We could do the following:
foo.*.id
This tells the system to operate on foo
as a filtered array and then select the id
property.
This would return:
[ 1, 2, 3 ]
Values in an expression may be converted from one type to another as the expression gets evaluated. When an expression is evaluated, the parameters are coalesced to the relevant data type and then turned back into strings.
For example, in this YAML, the values True
and False
are converted to 1
and 0
when the expression is evaluated.
The function lt()
returns True
when the left parameter is less than the right parameter.
variables:
firstEval: $[lt(False, True)] # 0 vs. 1, True
secondEval: $[lt(True, False)] # 1 vs. 0, False
steps:
- script: echo $(firstEval)
- script: echo $(secondEval)
When you use the eq()
expression for evaluating equivalence, values are implicitly converted to numbers (false
to 0
and true
to 1
).
variables:
trueAsNumber: $[eq('true', true)] # 1 vs. 1, True
falseAsNumber: $[eq('false', true)] # 0 vs. 1, False
steps:
- script: echo $(trueAsNumber)
- script: echo $(falseAsNumber)
In this next example, the values variables.emptyString
and the empty string both evaluate as empty strings.
The function coalesce()
evaluates the parameters in order, and returns the first value that doesn't equal null or empty-string.
variables:
coalesceLiteral: $[coalesce(variables.emptyString, '', 'literal value')]
steps:
- script: echo $(coalesceLiteral) # outputs literal value
Detailed conversion rules are listed further below.
From / To | Boolean | Null | Number | String | Version |
---|---|---|---|---|---|
Boolean | - | - | Yes | Yes | - |
Null | Yes | - | Yes | Yes | - |
Number | Yes | - | - | Yes | Partial |
String | Yes | Partial | Partial | - | Partial |
Version | Yes | - | - | Yes | - |
Boolean:
To number: False → 0, True → 1
To string: False → 'False', True → 'True'
Null:
To Boolean: False
To number: 0
To string: '' (the empty string)
Number:
To Boolean: 0 → False; any other number → True
String:
To Boolean: '' (the empty string) → False; any other string → True
To null: '' (the empty string) → Null; any other string isn't convertible
To number: '' (the empty string) → 0; otherwise, runs C#'s Int32.TryParse using InvariantCulture and the following rules: AllowDecimalPoint | AllowLeadingSign | AllowLeadingWhite | AllowThousands | AllowTrailingWhite. If TryParse fails, then it's not convertible.
To version: Runs C#'s Version.TryParse. Must contain Major and Minor components at minimum. If TryParse fails, then it's not convertible.
Version:
To Boolean: True
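As a hedged sketch of these rules in action (the variable names are illustrative), the string '2' is converted to a number before the comparison, and the number 0 is converted to a Boolean:

variables:
  stringToNumber: $[gt('2', 1)] # '2' converts to 2, so this evaluates to True
  numberToBoolean: $[not(0)]    # 0 converts to False, so not(0) is True

steps:
- script: echo $(stringToNumber) $(numberToBoolean)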
You can customize your Pipeline with a script that includes an expression. For example, this snippet takes the BUILD_BUILDNUMBER
variable and splits it with Bash. This script outputs two new variables, $MAJOR_RUN
and $MINOR_RUN
, for the major and minor run numbers.
The two variables are then used to create two pipeline variables, $major
and $minor
with task.setvariable. These variables are available to downstream steps. To share variables across pipelines see Variable groups.
steps:
- bash: |
MAJOR_RUN=$(echo $BUILD_BUILDNUMBER | cut -d '.' -f1)
echo "This is the major run number: $MAJOR_RUN"
echo "##vso[task.setvariable variable=major]$MAJOR_RUN"
MINOR_RUN=$(echo $BUILD_BUILDNUMBER | cut -d '.' -f2)
echo "This is the minor run number: $MINOR_RUN"
echo "##vso[task.setvariable variable=minor]$MINOR_RUN"
- bash: echo "My pipeline variable for major run is $(major)"
- bash: echo "My pipeline variable for minor run is $(minor)"