Get started with deployment pipelines
APPLIES TO: Power BI Desktop Power BI service
This article walks you through the basic settings required for using deployment pipelines in the Power BI service. We recommend reading the deployment pipelines introduction before you proceed.
You can also complete the Create and manage a Power BI deployment pipeline Learn module, which shows how to create a deployment pipeline.
In a deployment pipeline, one workspace is assigned to each stage. Before you start working with your pipeline in production, review the capacity requirements for the pipeline's workspaces.
Accessing deployment pipelines
You can access the deployment pipelines feature if the following conditions are met:
You have one of the following Premium licenses:
You're an admin of a workspace.
You can also see the deployment pipelines button if you previously created a pipeline or if a pipeline was shared with you.
Step 1 - Create a deployment pipeline
You can create a pipeline from the deployment pipelines tab, or from a workspace.
After the pipeline is created, you can share it with other users or delete it. Users you share a pipeline with are granted access to the pipeline and become pipeline admins. Pipeline access enables users to view, share, edit, and delete the pipeline.
Create a pipeline from the deployment pipelines tab
To create a pipeline from the deployment pipelines tab, do the following:
In Power BI service, from the navigation pane, select Deployment pipelines and then select Create pipeline.
In the Create a deployment pipeline dialog box, enter a name and description for the pipeline, and select Create.
Create a pipeline from a workspace
You can create a pipeline from an existing workspace, provided you're an admin of that workspace.
From the workspace, select Create a pipeline.
In the Create a deployment pipeline dialog box, enter a name and description for the pipeline, and select Create.
If the workspace isn't assigned to your organization's Premium capacity or to your Premium Per User (PPU) capacity, you'll get a notification to assign it to a capacity.
Step 2 - Assign a workspace
After creating a pipeline, you need to add the content you want to manage. You add content by assigning a workspace to a pipeline stage. You can assign a workspace to any stage.
Follow the instructions in the link to assign a workspace to a pipeline.
If you're creating a pipeline from a workspace, you can skip this stage as the workspace is already selected.
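Workspace assignment can also be scripted with the deployment pipelines REST API (Pipelines - Assign Workspace). The sketch below only constructs the request; the pipeline and workspace IDs are placeholders, and acquiring the Azure AD bearer token is assumed to happen elsewhere.

```python
import json

API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def assign_workspace_request(pipeline_id, stage_order, workspace_id):
    """Return (url, body) for the Assign Workspace call.

    stage_order: 0 = development, 1 = test, 2 = production."""
    url = f"{API_ROOT}/pipelines/{pipeline_id}/stages/{stage_order}/assignWorkspace"
    body = json.dumps({"workspaceId": workspace_id})
    return url, body
```

You'd POST `body` to `url` with an `Authorization: Bearer <token>` header; the call fails if the workspace doesn't meet the capacity requirements described earlier.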
Step 3 - Deploy to an empty stage
Any Pro user who's a member or admin in the source workspace can deploy content to an empty stage (a stage that doesn't contain content). The workspace must reside on a capacity for the deployment to complete.
You can also use the deployment pipelines REST APIs to programmatically perform deployments. For more information, see Automate your deployment pipeline using APIs and DevOps.
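As a minimal sketch of such automation, the following builds the request for the Pipelines - Deploy All REST call. Only the request is constructed, not sent; the pipeline ID and note text are placeholders, and authentication is out of scope here.

```python
import json

def deploy_all_request(pipeline_id, source_stage_order, note=""):
    """Return (url, body) for a 'deploy all' operation.

    source_stage_order: the stage to deploy FROM (0 = development, 1 = test)."""
    url = f"https://api.powerbi.com/v1.0/myorg/pipelines/{pipeline_id}/deployAll"
    body = {
        "sourceStageOrder": source_stage_order,
        "note": note,  # recorded in the deployment history
        # Allow the deployment to create items missing in the target stage
        # and to overwrite items that already exist there.
        "options": {"allowCreateArtifact": True, "allowOverwriteArtifact": True},
    }
    return url, json.dumps(body)
```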
If you already have a workspace that you'd like to use with a specific stage, instead of deploying you can assign that workspace to the appropriate stage.
When you deploy content to an empty stage, the relationships between the items are kept. For example, a report that's bound to a dataset in the source stage is cloned alongside its dataset, and the clones are similarly bound in the target workspace.
Once the deployment is complete, refresh the dataset. For more information, see deploying content to an empty stage.
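The post-deployment refresh can also be triggered programmatically with the datasets REST API (Datasets - Refresh Dataset In Group). The IDs below are placeholders and only the request URL is constructed:

```python
def refresh_request(group_id, dataset_id):
    """Return the URL to POST to in order to trigger a dataset refresh.

    group_id is the target workspace ID; dataset_id is the cloned dataset."""
    return (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
            f"/datasets/{dataset_id}/refreshes")
```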
Deployment pipelines offer three options when it comes to deploying your Power BI content:
Deploying all content - Deploy all your Power BI content to the target stage.
Selective deployment - Select which Power BI content to deploy to the target stage.
Backwards deployment - Deploy your content to a previous stage in the pipeline.
After you choose how to deploy your Power BI content, you can Review your deployment and leave a note.
Deploying all content
Select the stage to deploy from and then select the deployment button. The deployment process creates a duplicate workspace in the target stage. This workspace includes all the content existing in the current stage.
Selective deployment
To deploy only specific items, select the Show more link, and then select the items you want to deploy. When you select Deploy, only the selected items are deployed to the next stage.
Since dashboards, reports, datasets, and dataflows are related and have dependencies, you can use the Select related button to select all the items that the items you chose depend on. For example, if you want to deploy a report to the next stage, selecting Select related also marks the dataset the report is connected to, so that both are deployed together and the report doesn't break.
- You can't deploy a Power BI item to the next stage if the items it depends on don't exist in the stage you're deploying to. For example, deploying a report without its dataset fails, unless the dataset already exists in the target stage.
- You might get unexpected results if you deploy a Power BI item without the item it depends on. This can happen when a dataset or a dataflow in the target stage has changed and is no longer identical to the one in the stage you're deploying from.
Backwards deployment
You can choose to deploy to a previous stage. For example, you can assign an existing workspace to the production stage, and then deploy it backwards, first to the test stage and then to the development stage.
Deploying to a previous stage works only if the previous stage is empty. When you deploy to a previous stage, you can't select specific items; all content in the stage is deployed.
Review your deployment and leave a note
After you select which content to deploy, a pop-up window lists all the items you're about to deploy. You can review the list and add a note to the deployment. Adding a note is optional but highly recommended, because notes are added to the deployment history. With a note for each deployment, reviewing the history of your pipelines becomes more meaningful.
To leave a note, expand the Add a note option and write your note in the text box. When you're ready to deploy, select Deploy.
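The same review-and-note step can be scripted with the Pipelines - Deploy (selective) REST call. This is a hedged sketch: the item IDs are placeholders, only the request is built, and the note travels in the request body just as it does in the dialog above.

```python
import json

def selective_deploy_request(pipeline_id, source_stage_order,
                             dataset_ids, report_ids, note):
    """Return (url, body) for a selective deployment with a note."""
    url = f"https://api.powerbi.com/v1.0/myorg/pipelines/{pipeline_id}/deploy"
    body = {
        "sourceStageOrder": source_stage_order,
        # Deploy each dataset together with the reports bound to it,
        # mirroring the Select related behavior described earlier.
        "datasets": [{"sourceId": i} for i in dataset_ids],
        "reports": [{"sourceId": i} for i in report_ids],
        "note": note,  # appears in the deployment history
        "options": {"allowCreateArtifact": True, "allowOverwriteArtifact": True},
    }
    return url, json.dumps(body)
```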
Step 4 - Create deployment rules
When you're working in a deployment pipeline, different stages may have different configurations. For example, each stage can have different databases or different query parameters. The development stage might query sample data from the database, while the test and production stages query the entire database.
When you deploy content between pipeline stages, configuring deployment rules lets you allow changes to content while keeping some settings intact. For example, if you want a dataset in the production stage to point to a production database, you can define a rule for this. The rule is defined in the production stage, under the appropriate dataset. Once the rule is defined, content deployed from test to production inherits the value defined in the deployment rule. The rule always applies as long as it's unchanged and valid.
You can configure data source rules and parameter rules. The following table lists the type of Power BI items you can configure rules for, and the type of rule you can configure for each one.
|Power BI item|Data source rule|Parameter rule|Details|
|---|---|---|---|
|Dataflow|✔|✔|Use to determine the values of the data sources or parameters for a specific dataflow.|
|Dataset|✔|✔|Use to determine the values of the data sources or parameters for a specific dataset.|
|Datamart|✔|✔|Use to determine the values of the data sources or parameters for a specific datamart.|
|Paginated report|✔|✖|Defined for the data sources of each paginated report. You can use these rules to determine the data sources of the paginated report.|
Data source rules only work when you change data sources of the same type.
Create a deployment rule
To create a deployment rule, follow the steps in this section. After you create all the deployment rules you need, deploy the datasets with the newly created rules from the source stage to the target stage where the rules were created. Your rules will not apply until you deploy the datasets from the source to the target stage.
In the pipeline stage you want to create a deployment rule for, select Deployment rules.
You can set rules for dataflows, datasets, datamarts, and paginated reports. To find the Power BI item you want to set a rule for, use the search or filter functionality.
Select the dataflow, dataset, datamart or paginated report you want to create a rule for.
Select the type of rule you want to create, expand the list, and then select Add rule. There are two types of rules you can create:
Data source rules
From the data source list, select a data source name to be updated. Use one of the following methods to select a value to replace the one from the source stage:
Select from a list.
Select Other and manually add the new data source. You can only change to a data source of the same type.
- Data source rules are grayed out if you're not the owner of the Power BI item you're creating a rule for, or if the item doesn't contain any data sources.
- For dataflows, datasets and paginated reports, the data source list is taken from the source pipeline stage.
- You can’t use the same data source in more than one rule.
Parameter rules - Select a parameter from the list of parameters; the current value is shown. Edit the value to the value you want to take effect after each deployment.
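Conceptually, a parameter rule is a per-stage override that's applied on every deployment. The following sketch is an illustration only, with hypothetical parameter names and values; it's not how the service implements rules, just the substitution a production-stage rule performs:

```python
# Conceptual illustration: parameter names/values below are made up.
# A parameter rule replaces a parameter's source-stage value with the
# value configured in the target stage; unruled parameters are copied as-is.
def apply_parameter_rules(parameters, rules):
    return {name: rules.get(name, value) for name, value in parameters.items()}

dev_parameters = {"ServerName": "dev-sql.contoso.com", "RowLimit": "1000"}
production_rules = {"ServerName": "prod-sql.contoso.com"}

deployed = apply_parameter_rules(dev_parameters, production_rules)
# ServerName now points at the production server; RowLimit is unchanged.
```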
Deployment rule limitations
This section lists the limitations for the deployment rules.
To create a deployment rule, you must be the owner of the dataflow, dataset or paginated report you're creating a rule for.
Deployment rules cannot be created in the development stage.
When an item is removed or deleted, its rules are deleted too. These rules cannot be restored.
When you unassign and reassign a workspace to reestablish connections, rules for that workspace are not kept. To use these rules, you'll need to reconfigure them.
Rules for dataflows that have other dataflows as sources, are not supported.
Data source rules for common data model (CDM) folders in a dataflow, are not supported.
Rules for datasets that use dataflows as their source, are not supported.
If the data source defined in a rule is changed or removed from the item it points to in the source stage, the rule will not be valid and the deployment will fail.
If the parameter defined in a rule is changed or removed from the item it points to in the source stage, the rule will not be valid and the deployment will fail.
After you deploy a paginated report with a data source rule, opening the report using the Power BI Report Builder isn't supported.
Parameter rules aren't supported for paginated reports.
Supported data sources for rules
You can define data source rules for the following data sources. For other data sources, we recommend using parameters to configure your data source.
- Azure Analysis Services
- SQL Server Analysis Services (SSAS)
- Azure SQL Server
- SAP HANA (only supported for import mode, not DirectQuery mode)
Step 5 - Deploy content from one stage to another
Once you have content in a pipeline stage, you can deploy it to the next stage. Deploying content to another stage is usually done after you've performed some actions in the pipeline, such as making development changes in the development stage or testing your content in the test stage. A typical workflow is to move content from development to test, and then from test to production. You can learn more about this process in the deploy content to an existing workspace section.
To deploy content to the next stage in the deployment pipeline, select the deploy button at the bottom of the stage.
When reviewing the test and production stage cards, you can see the last deployment time. This indicates the last time content was deployed to the stage.
Deployment time is useful for establishing when a stage was last updated. It can also be helpful if you want to track time between test and production deployments.
When two sequential stages have content, the content is compared based on the content items' metadata. This comparison doesn't include comparing data or refresh time between stages.
To allow a quick visual insight into the differences between two sequential stages, a comparison icon indicator appears between them. The comparison indicator has two states:
Green indicator – The metadata for each content item in both stages is the same.
Orange indicator – Appears if one of these conditions is met:
- Some of the content items in each stage were changed or updated (have different metadata).
- There's a difference in the number of items between the stages.
When two sequential stages aren't the same, a compare link appears underneath the orange comparison icon. Selecting the link opens the content item list of both stages in Compare view. Compare view helps you track changes or differences between items in each pipeline stage. Changed items get one of the following labels:
New – A new item in the source stage. This is an item that doesn't exist in the target stage. After deployment, this item will be cloned to the target stage.
Different – An item that exists in both the source and the target stage, where one of the versions was changed after the last deployment. After deployment, the item in the source stage overwrites the item in the target stage, regardless of where the change was made.
Datasets with configured deployment rules that haven't been deployed, are also marked as different. This is because deployment rules are not applied until the datasets are deployed from the source stage to the target stage, which includes the configured rules.
Missing from – An item that appears in the target stage, but not in the source stage.
Deployment doesn't affect missing from items.
The comparison is designed to help you understand the difference between the stages. In the comparison display, items are arranged according to the logic listed below.
The source workspace items are ordered alphabetically within groups, based on their type. The groups are ordered according to the flow of data: for example, first datasets, then reports, and lastly dashboards.
The target workspace is ordered according to the source workspace. Each item resides in the same row as its equivalent item in the source stage.
Items that exist in the target workspace but not in the source workspace, are displayed at the bottom of their type group, in alphabetical order.
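The ordering logic above can be sketched as a small helper. This is a conceptual reimplementation, not the service's code; item names and the three type groups are taken from the description, and each side is modeled as a list of (name, type) pairs:

```python
# Conceptual sketch of the Compare view row layout described above.
# Type groups follow the flow of data: datasets, then reports, then dashboards.
TYPE_GROUPS = ("dataset", "report", "dashboard")

def compare_view_rows(source, target):
    """Pair source and target items the way Compare view arranges them.

    source/target: lists of (name, type) tuples.
    Returns rows of (source_name, target_name); either side may be None."""
    rows = []
    for typ in TYPE_GROUPS:
        src_names = sorted(n for n, t in source if t == typ)
        tgt_names = {n for n, t in target if t == typ}
        # Source items alphabetically; the matching target item (if any)
        # sits in the same row.
        for name in src_names:
            rows.append((name, name if name in tgt_names else None))
        # Target-only items go at the bottom of their type group, alphabetically.
        for name in sorted(tgt_names - set(src_names)):
            rows.append((None, name))
    return rows
```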
When you deploy after making changes to content in the source stage, the corresponding content in the target stage is overwritten. After you select Deploy, you'll get a warning listing the number of items that will be overwritten.