Deployment pipelines best practices
This article provides guidance for business intelligence (BI) creators who are managing their content throughout its lifecycle. The article focuses on the use of deployment pipelines as a BI content lifecycle management tool.
The article is divided into four sections:
Content preparation - Prepare your content for lifecycle management.
Development - Learn about the best ways of creating content in the deployment pipelines development stage.
Test - Understand how to use the deployment pipelines test stage to test your environment.
Production - Utilize the deployment pipelines production stage when making your content available for consumption.
Content preparation
Prepare your content for ongoing management throughout its lifecycle. Review the information in this section before you:
Publish your work.
Start using a deployment pipeline for a specific workspace.
Release content to production.
Treat each workspace as a complete package of analytics
Ideally, a workspace should contain a complete view of one aspect (such as department, business unit, project, or vertical) in your organization. Treating each workspace as a complete package makes it easier to manage permissions for different users, and allows you to control content releases for the entire workspace according to a planned schedule.
If you're using centralized datasets that are used across the organization, we recommend that you create two types of workspaces:
Modeling and data workspaces - Workspaces that contain all the centralized datasets
Reporting workspaces - Workspaces that contain all dependent reports and dashboards
Plan your permission model
A deployment pipeline is a Power BI object with its own permissions. In addition, the pipeline contains workspaces that have their own permissions.
To implement a secure and easy workflow, plan who gets access to each part of the pipeline. Some of the considerations to take into account are:
Who should have access to the pipeline?
Which operations should users with pipeline access be able to perform in each stage?
Who's reviewing content in the test stage?
Should the test stage reviewers have access to the pipeline?
Who oversees deployment to the production stage?
Which workspace are you assigning?
Which stage are you assigning your workspace to?
Do you need to make changes to the permissions of the workspace you're assigning?
Connect different stages to different databases
A production database should always be stable and available. It's better not to overload it with queries generated by BI creators for their development or test datasets. Build separate databases for development and testing, both to protect production data and to avoid loading the full volume of production data into the development database.
Note
If your organization uses shared centralized datasets, you can skip this recommendation.
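The stage-to-database separation described above is usually implemented with deployment rules, but it can help to capture the mapping explicitly in automation or documentation. The sketch below shows one way to do that; all server and database names are hypothetical placeholders, not real endpoints:

```python
# Hypothetical mapping of pipeline stages to separate databases.
# Server and database names are placeholders for illustration only.
STAGE_DATABASES = {
    "development": {"server": "dev-sql.contoso.com", "database": "SalesDB_Dev"},
    "test": {"server": "test-sql.contoso.com", "database": "SalesDB_Test"},
    "production": {"server": "prod-sql.contoso.com", "database": "SalesDB"},
}

def connection_for_stage(stage: str) -> str:
    """Build a connection string for the database assigned to a stage."""
    cfg = STAGE_DATABASES[stage]
    return f"Server={cfg['server']};Database={cfg['database']}"
```

A mapping like this keeps the intent of "different stages, different databases" explicit, so a deployment script can fail fast if someone points a non-production stage at the production server.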
Use parameters in your model
As you can't edit dataset data sources in the Power BI service, we recommend using parameters to store connection details such as instance names and database names. By using parameters instead of static connection strings, you can manage the connections through the Power BI service web portal, or use APIs, at a later stage.
In deployment pipelines, you can configure parameter rules to set different values for each deployment stage. You can also set rules for paginated reports.
If you don’t use parameters for your connection string, you can define data source rules to specify a connection string for a given dataset. However, rules aren't supported in deployment pipelines for all data sources. To verify that you can configure rules for your data source, see deployment rules limitations.
Parameters also have other uses, such as making changes to queries, filters, and the text displayed in the report.
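Because parameters can also be changed through the Power BI REST API, automation scripts often build the request body for the datasets Update Parameters operation. The sketch below only constructs that JSON body (it sends nothing); the parameter names shown are illustrative assumptions, and you should verify the exact payload shape against the REST API reference:

```python
def update_parameters_body(values: dict) -> dict:
    """Build the JSON body for the datasets Update Parameters REST operation.

    `values` maps parameter names to new values, e.g. the hypothetical
    ServerName/DatabaseName parameters used in this article's examples.
    """
    return {
        "updateDetails": [
            {"name": name, "newValue": new_value}
            for name, new_value in values.items()
        ]
    }

# Example: point a dataset at a test server (names are placeholders).
body = update_parameters_body({"ServerName": "test-sql.contoso.com"})
```

Keeping the body construction in one helper makes it easy to reuse the same script for each stage, swapping only the parameter values.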
Development
This section provides guidance for working with the deployment pipelines development stage.
Use Power BI Desktop to edit your reports and datasets
Consider Power BI Desktop as your local development environment. Power BI Desktop allows you to try, explore, and review updates to your reports and datasets. Once the work is done, you can upload the new version to the development stage. We recommend editing .pbix files in Power BI Desktop (and not in the Power BI service) for the following reasons:
It's easier to collaborate with fellow creators on the same .pbix file, if all changes are done with the same tool.
The process of making online changes, downloading the .pbix file, and then uploading it again creates duplicate reports and datasets.
You can use version control to keep your .pbix files up to date.
Version control for .pbix files
If you want to manage the version history of your reports and datasets, use Power BI's auto-sync with OneDrive. Auto-sync keeps your files updated with the latest version and enables you to retrieve older versions if needed.
Note
Synchronize with OneDrive (or any other repository) only with the .pbix files in the deployment pipeline's development stage. Syncing .pbix files into the deployment pipeline's test and production stages causes problems with deploying content across the pipeline.
Separate modeling development from report and dashboard development
For enterprise scale deployments, we recommend that you separate dataset development from the development of reports and dashboards. To promote changes to only a report or a dataset, use the deployment pipelines selective deploy option.
Start by creating a separate .pbix file for datasets and reports in Power BI Desktop. For example, create a dataset .pbix file and upload it to the development stage. Later, the report authors can create a new .pbix only for the report and connect it to the published dataset by using a live connection. This technique allows different creators to separately work on modeling and visualizations, and deploy them to production independently.
Create shared datasets to use this method across workspaces.
Manage your models using XMLA read/write capabilities
When you separate modeling development from report and dashboard development, you can also take advantage of advanced capabilities such as source control, merging diff changes, and automated processes. Make these changes in the development stage so that the final content can be deployed to the test and production stages. This way, all changes go through a unified process with other dependent items before they're deployed to the production stage.
Separate modeling development from visualizations by managing a shared dataset in an external workspace with XMLA read/write capabilities. Multiple reports in various workspaces, managed in multiple pipelines, can connect to the shared dataset.
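Tools that use the XMLA endpoint connect to a Premium workspace through a URL of the form powerbi://api.powerbi.com/v1.0/myorg/<workspace name>. A small helper can build that URL; the workspace name below is a placeholder:

```python
from urllib.parse import quote

def xmla_endpoint(workspace_name: str) -> str:
    """Return the XMLA endpoint URL for a Premium workspace.

    The workspace name is URL-encoded because workspace names may
    contain spaces. "Sales Models" is a hypothetical example name.
    """
    return f"powerbi://api.powerbi.com/v1.0/myorg/{quote(workspace_name)}"
```

This endpoint is what external modeling tools and automation scripts would use to open the shared dataset's model for source control or diff merging.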
Test
This section provides guidance for working with the deployment pipelines test stage.
Simulate your production environment
Other than verifying that new reports or dashboards look acceptable, it's important to see how they perform from an end user's perspective. The deployment pipelines test stage allows you to simulate a real production environment for testing purposes.
Make sure that these three factors are addressed in your test environment:
Data volume
Usage volume
Capacity that's similar to the production capacity
When testing, you can use the same capacity as the production stage. However, that can make production unstable during load testing. To avoid destabilizing production, test on a separate capacity with resources similar to the production capacity. To avoid extra costs, use a capacity where you pay only for the testing time.
Use deployment rules with a real-life data source
If you're using the test stage to simulate real life data usage, it's recommended to separate the development and test data sources. The development database should be relatively small, and the test database should be as similar as possible to the production database. Use data source rules to switch data sources in the test stage.
If you use a production data source in the test stage, it's useful to control the amount of data you import from your data source. You can control the amount of data you import by adding a parameter to your data source query in Power BI Desktop. Use parameter rules to control the amount of imported data or edit the parameter's value. You can also use this approach to avoid overloading your capacity.
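The stage-dependent data volume described above can be sketched as a single row-limit parameter that a parameter rule sets per stage. In this illustration the stage names, limits, and query text are all hypothetical; the same idea applies to whatever filter your data source query supports:

```python
# Hypothetical row caps per stage; None means no cap (production imports everything).
ROW_LIMITS = {"development": 10_000, "test": 1_000_000, "production": None}

def query_for_stage(stage: str, base_query: str) -> str:
    """Apply the stage's row-limit parameter to a SQL query.

    Mimics what a RowLimit parameter in the Power Query source step
    would do: cap imported rows in dev/test, import everything in prod.
    """
    limit = ROW_LIMITS[stage]
    if limit is None:
        return base_query
    return base_query.replace("SELECT", f"SELECT TOP {limit}", 1)
```

With this pattern, a single parameter rule per stage controls both test fidelity and capacity load.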
Measure performance
When you simulate a production stage, check the report load and interactions to see if the changes you made affect them.
You should also monitor the load on the capacity to catch extreme loads before they reach production.
Note
It's best to monitor capacity loads again after you deploy updates to the production stage.
Check related items
Changes you make to datasets or reports can also affect related items. During testing, verify that your changes don't affect or break the performance of existing items that depend on the updated ones.
You can easily find the related items by using the workspace lineage view.
Test your app
If you're distributing content to your end users through an app, review the app's new version before it reaches production. Because each deployment pipeline stage has its own workspace, you can easily publish and update apps for the development and test stages, and test the app from an end user's point of view.
Important
The deployment process doesn't include updating the app content or settings. To apply changes to content or settings, manually update the app in the required pipeline stage.
Production
This section provides guidance to the deployment pipelines production stage.
Manage who can deploy to production
Because deploying to production should be handled carefully, it's good practice to let only specific people manage this sensitive operation. However, you probably want all BI creators for a specific workspace to have access to the pipeline. Use production workspace permissions to manage access permissions.
To deploy content between stages, users need either member or admin permissions for both stages. Make sure that only the people you want to deploy to production have these permissions. Other users can have production workspace contributor or viewer roles. Users with contributor or viewer roles can see content from within the pipeline but can't deploy.
In addition, limit access to the pipeline by granting pipeline permissions only to users who are part of the content creation process.
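The deployment requirement described above (member or admin permissions on both stages' workspaces) can be expressed as a simple check, which is useful in scripts that validate permissions before attempting an automated deployment. The role names follow the workspace roles mentioned in this section:

```python
# Workspace roles that allow deploying between stages, per this article:
# users need member or admin permissions on BOTH stages' workspaces.
DEPLOY_ROLES = {"Admin", "Member"}
# Contributor and Viewer can see pipeline content but can't deploy.
READ_ONLY_ROLES = {"Contributor", "Viewer"}

def can_deploy(source_role: str, target_role: str) -> bool:
    """Return True if a user's roles on both stages permit deployment."""
    return source_role in DEPLOY_ROLES and target_role in DEPLOY_ROLES
```

A pre-flight check like this gives a clearer error message than letting the deployment fail partway through.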
Set rules to ensure production stage availability
Deployment rules are a powerful way to ensure the data in production is always connected and available to users. With deployment rules applied, you can deploy with confidence that end users will continue to see the relevant information without disruption.
Make sure that you set production deployment rules for data sources and parameters defined in the dataset.
After a major dataset change, refresh the dataset so users see the updated data.
Update the production app
Deployment in a pipeline updates the workspace content, but it doesn't update the associated app automatically. If you use an app for content distribution, don't forget to update the app after deploying to production so that end users are immediately able to use the latest version.
Quick fixes to content
Sometimes there are issues in production that require a quick fix. Never upload a new .pbix version directly to the production stage or make an online change in the Power BI service. You can't deploy backwards to a previous stage when that stage already contains content, and deploying a fix without testing it first is bad practice. Therefore, always implement the fix in the development stage and push it through the rest of the deployment pipeline stages. Deploying to the development stage first lets you verify that the fix works before it reaches production, and deploying across the pipeline takes only a few minutes.
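If you automate promoting a fix through the pipeline, the REST deployment operation takes a JSON body identifying the source stage. The sketch below only builds that body; treat the field and option names as assumptions to verify against the Power BI REST API reference before use:

```python
def deploy_all_body(source_stage_order: int, note: str = "") -> dict:
    """Build a JSON body for the pipelines deploy-all REST operation.

    Assumed convention: stage order 0 deploys development -> test,
    and stage order 1 deploys test -> production.
    """
    return {
        "sourceStageOrder": source_stage_order,
        "note": note,
        "options": {
            # Assumed option names: allow new items to be created in the
            # target stage and existing items to be overwritten.
            "allowCreateArtifact": True,
            "allowOverwriteArtifact": True,
        },
    }

# Example: promote a verified fix from test to production with an audit note.
body = deploy_all_body(1, note="Hotfix: corrected sales measure")
```

Scripting both hops (dev to test, then test to production) keeps the quick fix on the same tested path as any other release.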