Create content to migrate to Power BI

This article describes Stage 4, which is concerned with creating and validating content when migrating to Power BI.

Image showing the stages of a Power BI migration. Stage 4 is emphasized for this article.


For a complete explanation of the above graphic, see Power BI migration overview.

The focus of Stage 4 is performing the actual work to convert the proof of concept (POC) to a production-ready solution.

The output from this stage is a Power BI solution that has been validated in a development workspace and is ready for deployment to production.


Most of the topics discussed in this article also apply to a standard Power BI implementation project.

Create the production solution

At this juncture, the same person who performed the POC may carry on to produce the production-ready Power BI solution. Or, someone different may be involved. If timelines aren't jeopardized, it's worthwhile to involve the people who will be responsible for Power BI development in the future, so they can actively learn.


Reuse as much of the work from the POC as possible.

Develop new Import dataset

You may choose to create a new Import dataset when a suitable Power BI dataset doesn't already exist, or when an existing dataset can't be enhanced to meet your needs.

Ideally, consider decoupling the development work for data and reports from the very beginning. Decoupling facilitates the separation of work, and of permissions, when different people are responsible for data modeling and reports. It makes for a more scalable approach and encourages data reusability.

The essential activities related to development of an Import dataset include:


If you have different development/test/production environments, consider parameterizing data sources. It will make deployment, described in Stage 5, significantly easier.
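For example, a deployment script can call the Power BI REST API UpdateParameters endpoint to repoint a published dataset's parameterized data sources at the correct environment. The sketch below only builds the request URL and JSON body; the workspace ID, dataset ID, and parameter names (ServerName, DatabaseName) are illustrative assumptions.

```python
import json

# Sketch: build an "Update Parameters In Group" request for the Power BI REST API.
# The IDs and parameter names are placeholders for your own environment.
API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def build_update_parameters_request(group_id, dataset_id, parameters):
    """Return the endpoint URL and JSON body for an UpdateParameters call."""
    url = f"{API_ROOT}/groups/{group_id}/datasets/{dataset_id}/Default.UpdateParameters"
    body = json.dumps({
        "updateDetails": [
            {"name": name, "newValue": value}
            for name, value in parameters.items()
        ]
    })
    return url, body

# Point the test copy of the dataset at the test SQL Server.
url, body = build_update_parameters_request(
    "00000000-0000-0000-0000-000000000000",
    "11111111-1111-1111-1111-111111111111",
    {"ServerName": "sql-test.contoso.com", "DatabaseName": "SalesDW_Test"},
)
```

The request itself would be sent with an Azure AD bearer token; after updating parameters, trigger a dataset refresh so the new source values take effect.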

Develop new reports and dashboards

The essential activities related to development of a Power BI report or dashboard include:

  • Decide whether to use a Live Connection to an existing data model or to create a new data model.
  • When creating a new data model, decide on the data storage mode for model tables (Import, DirectQuery, or Composite).
  • Decide on the best data visualization tool to meet requirements: Power BI Desktop, Paginated Report Builder, or Excel.
  • Decide on the best visuals to tell the story the report needs to tell, and to address the questions the report needs to answer.
  • Ensure all visuals use clear, concise, and business-friendly terminology.
  • Address interactivity requirements.
  • When using Live Connection, add report-level measures.
  • Create a dashboard in the Power BI service, especially when consumers want an easy way to monitor key metrics.


Many of these decisions will have been made in earlier stages of planning or in the technical POC.

Validate the solution

There are four main aspects to validation of a Power BI solution:

  1. Data accuracy
  2. Security
  3. Functionality
  4. Performance

Validate data accuracy

As a one-time effort during the migration, you'll need to ensure the data in the new report matches what's displayed in the legacy report. Or, if there's a difference, be able to explain why. It's more common than you might think to find an error in the legacy solution that gets resolved in the new solution.

As part of ongoing data validation efforts, the new report will typically need to be cross-checked with the original source system. Ideally, this validation occurs in a repeatable way every time you publish a report change.
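One repeatable approach is to script the cross-check: export key aggregates from both the source system and the new dataset, and flag any difference. The sketch below assumes pandas and uses made-up column names and inline sample data in place of real extracts.

```python
import pandas as pd

# Sketch of a repeatable data-accuracy check: compare an aggregate
# ("Sales" by "Region" -- illustrative names) between two extracts.
legacy = pd.DataFrame({
    "Region": ["East", "West", "East", "West"],
    "Sales":  [100.0, 250.0, 175.0, 80.0],
})
new = pd.DataFrame({
    "Region": ["East", "East", "West", "West"],
    "Sales":  [100.0, 175.0, 250.0, 80.0],
})

def compare_totals(a, b, key, measure):
    """Return rows where the aggregated measure differs between the extracts."""
    merged = (
        a.groupby(key)[measure].sum().rename("legacy").to_frame()
         .join(b.groupby(key)[measure].sum().rename("new"), how="outer")
    )
    merged["diff"] = merged["new"].fillna(0) - merged["legacy"].fillna(0)
    return merged[merged["diff"] != 0]

mismatches = compare_totals(legacy, new, "Region", "Sales")
assert mismatches.empty  # totals reconcile; any rows returned need investigation
```

Running a script like this on every publish turns data validation from a one-off exercise into a routine gate.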

Validate security

When validating security, there are two primary aspects to consider:

  • Data permissions
  • Access to datasets, reports, and dashboards

In an Import dataset, data permissions are applied by defining row-level security (RLS). When using DirectQuery storage mode, data permissions may instead be enforced by the source system (possibly with single sign-on).

The main ways to grant access to Power BI content are:


We recommend training content authors on how to manage security effectively. It's also important to have robust testing, auditing, and monitoring in place.

Validate functionality

Now is the time to double-check dataset details like field names, formatting, sorting, and default summarization behavior. Interactive report features, such as slicers, drill-down actions, drillthrough actions, expressions, buttons, or bookmarks, should all be verified, too.

During the development process, the Power BI solution should be published to a development workspace in the Power BI service on a regular basis. Verify all functionality works as expected in the service, such as the rendering of custom visuals. It's also a good time to do further testing. Test scheduled refresh, Q&A, and how reports and dashboards look on a mobile device.

Validate performance

Performance of the Power BI solution is important for consumer experience. Most reports should present visuals in under 10 seconds. If you have reports that take longer to load, pause and reconsider what may be contributing to delays. Report performance should be assessed regularly in the Power BI service, in addition to Power BI Desktop.

Many performance issues arise from substandard DAX (Data Analysis Expressions), poor dataset design, or suboptimal report design (for instance, trying to render too many visuals on a single page). Technical environment issues, such as the network, an overloaded data gateway, or how a Premium capacity is configured, can also contribute to performance issues. For more information, see the Optimization guide for Power BI and Troubleshoot report performance in Power BI.
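As a rough illustration of the 10-second guideline, per-visual timing data (for example, captured with Performance Analyzer in Power BI Desktop) can be filtered to find the visuals worth optimizing first. The JSON structure below is a simplified, hypothetical stand-in for such a log; adapt the field names to the format you actually capture.

```python
import json

# Sketch: triage slow visuals against a 10-second budget.
# The log structure and field names ("events", "visual", "durationMs")
# are assumptions for illustration, not an actual export schema.
TEN_SECONDS_MS = 10_000

sample_log = json.loads("""
{
  "events": [
    {"visual": "Sales by Region", "durationMs": 12500},
    {"visual": "KPI Cards",       "durationMs": 850},
    {"visual": "Detail Table",    "durationMs": 10200}
  ]
}
""")

slow_visuals = [
    (event["visual"], event["durationMs"])
    for event in sample_log["events"]
    if event["durationMs"] > TEN_SECONDS_MS
]
print(slow_visuals)  # the visuals that exceed the budget
```

Repeating a check like this in the Power BI service, not just in Power BI Desktop, helps catch issues that only appear under service conditions.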

Document the solution

There are two main types of documentation that are useful for a Power BI solution:

  • Dataset documentation
  • Report documentation

Documentation can be stored wherever it's most easily accessed by the target audience. Common options include:

  • Within a SharePoint site: A SharePoint site may exist for your Center of Excellence or an internal Power BI community site.
  • Within an app: URLs may be configured when publishing a Power BI app to direct the consumer to more information.
  • Within individual Power BI Desktop files: Model elements, like tables and columns, can define a description. These descriptions appear as tooltips in the Fields pane when authoring reports.


If you create a site to serve as a hub for Power BI-related documentation, consider customizing the Get Help menu with its URL.

Create dataset documentation

Dataset documentation is targeted at users who will be managing the dataset in the future. It's useful to include:

  • Design decisions made and reasons why.
  • Who owns, maintains, and certifies datasets.
  • Data refresh requirements.
  • Custom business rules defined in datasets.
  • Specific dataset security or data privacy requirements.
  • Future maintenance needs.
  • Known open issues or deferred backlog items.

You may also elect to create a change log that summarizes the most important changes that have happened to the dataset over time.

Create report documentation

Report documentation, which is typically structured as a walk-through targeted at report consumers, can help consumers get more value from your reports and dashboards. A short video tutorial often works well.

You may also choose to include additional report documentation on a hidden page of your report. It could include design decisions and a change log.

Next steps

In the next article in this Power BI migration series, learn about Stage 5, which is concerned with deploying, supporting, and monitoring content when migrating to Power BI.

Other helpful resources include:

Experienced Power BI partners are available to help your organization succeed with the migration process. To engage a Power BI partner, visit the Power BI partner portal.