Events
Power BI DataViz World Championships
14 Feb, 4 pm - 31 Mar, 4 pm
With 4 chances to enter, you could win a conference package and make it to the LIVE Grand Finale in Las Vegas
This article explains important concepts about datamarts.
Datamarts provide a semantic layer, surfaced as an automatically generated semantic model that stays synchronized with the datamart's tables, their structure, and their underlying data. This automatic generation and synchronization enables you to further describe the domain of data with hierarchies, friendly names, and descriptions. You can also set formatting specific to your locale or business requirements. With datamarts, you can create measures and standardized metrics for reporting. Power BI (and other client tools) can create visuals and return results for such calculations based on the data in context.
The default Power BI semantic model created from a datamart eliminates the need to connect to a separate semantic model, set up refresh schedules, and manage multiple data elements. Instead, you can build your business logic in a datamart, and its data is immediately available in Power BI.
During preview, default semantic model connectivity is available using DirectQuery only. The following image shows how datamarts fit into the process continuum starting with connecting to data, all the way through creating reports.
Default semantic models differ from traditional Power BI semantic models in several ways.
With Power BI Desktop, users can build composite models, enabling you to connect to the datamart's semantic model and extend it.
Finally, if you don't want to use the default semantic model directly, you can connect to the datamart’s SQL endpoint. For more information, see Create reports using datamarts.
Currently, tables in the datamart are added to the default semantic model automatically. For more flexibility, users can also manually select which tables or views from the datamart to include in the model. Objects in the default semantic model are created as a layout in the model view.
The background sync that includes objects (tables and views) waits until the downstream semantic model is not in use before updating it, honoring bounded staleness. Users can always manually choose which tables to include in or exclude from the semantic model.
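The wait-for-idle behavior with a staleness bound can be sketched roughly as follows. This is an illustrative model only; the actual sync logic is internal to the Power BI service, and the function, parameter names, and timing values here are assumptions:

```python
import time

def sync_default_model(model, datamart_objects, max_staleness_s=3600.0,
                       poll_interval_s=5.0, now=time.monotonic):
    """Illustrative sketch: wait for the downstream semantic model to be
    idle before updating it, but never let it lag the datamart by more
    than max_staleness_s (the 'bounded staleness' guarantee)."""
    deadline = now() + max_staleness_s
    # Wait politely while the model is in use...
    while model.in_use() and now() < deadline:
        time.sleep(poll_interval_s)
    # ...but sync anyway once the staleness bound is reached.
    model.update_objects(datamart_objects)
```

The key design point mirrored here is that the sync defers to active consumers of the model, yet the staleness bound guarantees the update is never postponed indefinitely.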
You can create and modify incremental data refresh, similar to dataflows and semantic model incremental refresh, using the datamart editor. Incremental refresh extends scheduled refresh operations by providing automated partition creation and management for datamart tables that frequently load new and updated data.
For most datamarts, incremental refresh involves one or more tables that contain transaction data that changes often and can grow exponentially, such as a fact table in a relational or star database schema. If you use an incremental refresh policy to partition the table, and refresh only the most recent import partitions, you can significantly reduce the amount of data that must be refreshed.
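The partitioning idea described above can be illustrated with a small sketch: keep older data in archive partitions that are never reloaded, and refresh only the recent daily partitions inside the incremental window. The policy values (5 archive years, 10 refresh days) are made-up examples, not defaults of the feature:

```python
from datetime import date, timedelta

def partitions_to_refresh(today, archive_years=5, refresh_days=10):
    """Illustrative incremental refresh policy: data older than
    archive_years is kept but never refreshed; only the daily
    partitions within the last refresh_days are reloaded."""
    archive_start = date(today.year - archive_years, today.month, today.day)
    refresh_start = today - timedelta(days=refresh_days - 1)
    # One static partition for historical data (kept, never reloaded).
    archived = ("archive", archive_start, refresh_start - timedelta(days=1))
    # One partition per day inside the refresh window (reloaded each run).
    hot = [("refresh",
            refresh_start + timedelta(days=i),
            refresh_start + timedelta(days=i))
           for i in range(refresh_days)]
    return [archived] + hot
```

With a policy like this, a refresh touches only the small "refresh" partitions rather than the whole fact table, which is the source of the reduction in refreshed data described above.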
Incremental refresh and real-time data for datamarts offer clear advantages: because only the most recent partitions are reloaded, refreshes are faster, more reliable, and consume fewer resources.
Proactive caching enables automatic import of the underlying data for the default semantic model so you don't need to manage or orchestrate the storage mode. Import mode for the default semantic model provides performance acceleration for the datamart's semantic model by using the fast Vertipaq engine. When you use proactive caching, Power BI changes the storage mode of your model to import, which uses the in-memory engine in Power BI and Analysis Services.
Proactive caching works in the following way: after each refresh, the storage mode for the default semantic model is changed to DirectQuery. Proactive caching then builds a side-by-side import model asynchronously; this process is managed by the datamart and doesn't affect the datamart's availability or performance. Queries arriving after the import model is complete use the import model.
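The query-routing behavior just described can be sketched as a small state machine. This is an illustrative model of the documented flow, not actual service code, and the class and method names are assumptions:

```python
class DefaultSemanticModel:
    """Illustrative sketch of proactive caching: after a datamart
    refresh the model falls back to DirectQuery, an import copy is
    built in the background, and queries switch to the in-memory
    engine once that copy is ready."""

    def __init__(self):
        self.storage_mode = "DirectQuery"
        self.import_ready = False

    def on_datamart_refresh(self):
        # A refresh invalidates the cache: serve via DirectQuery again
        # while a fresh import model is rebuilt asynchronously.
        self.storage_mode = "DirectQuery"
        self.import_ready = False

    def on_import_build_complete(self):
        # The side-by-side import model finished building.
        self.import_ready = True
        self.storage_mode = "Import"

    def route_query(self, query):
        engine = "Vertipaq" if self.import_ready else "DirectQuery"
        return f"{engine}: {query}"
```

The point of the side-by-side build is visible in the sketch: queries are never blocked; they simply use the slower DirectQuery path until the faster import copy is available.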
Auto-generation of the import model occurs within approximately 10 minutes of no further changes being detected in the datamart.
Use deployment pipelines for changes to ensure the best performance and to ensure users benefit from the import model. Using deployment pipelines is already a best practice for building datamarts, and doing so also means you take advantage of proactive caching more often.
This article provided an overview of important datamart concepts to understand.
The following articles provide more information about datamarts and Power BI:
For more information about dataflows and transforming data, see the dataflows documentation.
Documentation
- Get started with datamarts (preview) - Power BI: Begin using datamarts with sample data and examples.
- Analyzing datamarts (preview) - Power BI: Learn how to analyze your datamarts using tools such as the Datamart editor and the SQL query editor.
- Create reports using datamarts (preview) - Power BI: Learn how to create and share reports using datamarts in Power BI, including live connections, composite models, and SQL endpoints.