With Microsoft OneLake integration for semantic models, data imported into model tables can also be automatically written to Delta tables in OneLake. The Delta format is the unified table format across all compute engines in Microsoft Fabric. OneLake integration exports the data with all key performance features enabled, providing seamless, higher-performance data access.
Data scientists, database analysts, app developers, data engineers, and other data consumers can then access the same data that drives your business intelligence and financial reports in Power BI. T-SQL, Python, Scala, PySpark, Spark SQL, R, and no-code/low-code solutions can all be used to query data from Delta tables.
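As a sketch of how another engine might reach the exported data, the snippet below builds a OneLake path for a model table and shows how PySpark could read it. The workspace, model, and table names are placeholders, and the `Tables/` folder layout is an assumption based on the export folder structure described later in this article.

```python
# Sketch: build the OneLake path for an exported semantic-model table.
# Workspace/model/table names below are hypothetical placeholders.
def onelake_table_path(workspace: str, model: str, table: str) -> str:
    # OneLake addresses items as <workspace>/<item>.<ItemType>; exported
    # semantic-model tables are assumed to sit under Tables/ in the
    # model's folder.
    return (f"abfss://{workspace}@onelake.dfs.fabric.microsoft.com/"
            f"{model}.SemanticModel/Tables/{table}")

# In a Fabric notebook (where a Spark session is provided as `spark`):
# df = spark.read.format("delta").load(
#     onelake_table_path("Sales", "SalesModel", "Orders"))
# df.show()
```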
Before implementing a OneLake integration solution in your organization, be sure to read Considerations and limitations later in this article.
OneLake integration for semantic models is supported on Power BI Premium P and Microsoft Fabric F SKUs only. It's not supported on Power BI Pro, Premium Per User, or Power BI Embedded A/EM SKUs.
Before enabling OneLake integration, you must have model contributor (read, write, explore) permissions. These permissions are required to access the contents of a model folder and to create shortcuts linking to the folder in Lakehouse explorer.
In your semantic model settings, expand OneLake integration, set the slider to On, and then select Apply.
Global and tenant admins can control OneLake integration by using a tenant setting in the Power BI admin portal:
Before model import data can be written to a Delta table in OneLake, at least one refresh must run for the model. Run a manual refresh, or wait for the next scheduled refresh.
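One way to trigger that refresh programmatically is the Power BI REST API's dataset refresh endpoint. The sketch below builds such a request with the standard library; the group ID, dataset ID, and access token are placeholders you must supply.

```python
# Sketch: trigger a manual refresh via the Power BI REST API
# (POST .../groups/{groupId}/datasets/{datasetId}/refreshes).
# IDs and the bearer token below are hypothetical placeholders.
import json
import urllib.request


def build_refresh_request(group_id: str, dataset_id: str,
                          token: str) -> urllib.request.Request:
    """Build a POST request that asks the service to run a full refresh."""
    url = (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
           f"/datasets/{dataset_id}/refreshes")
    return urllib.request.Request(
        url,
        data=json.dumps({"type": "full"}).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

# To send it: urllib.request.urlopen(build_refresh_request(...))
```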
Delta tables can be exported in many ways. If your semantic model has XMLA read-write mode enabled, you can export programmatically by using the Tabular Object Model (TOM) and Tabular Model Scripting Language (TMSL).
For example, you can use SQL Server Management Studio (SSMS) to run the following TMSL command:
{
  "export": {
    "layout": "delta",
    "type": "full",
    "objects": [
      {
        "database": "<database name>"
      }
    ]
  }
}
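Because the TMSL export command is plain JSON, you can also generate it from a script before sending it through an XMLA client. A minimal sketch, with the database name as a placeholder:

```python
import json


def delta_export_command(database: str) -> str:
    """Build the TMSL export command for a given model (TMSL database)
    name, matching the delta/full layout used in this article."""
    command = {
        "export": {
            "layout": "delta",
            "type": "full",
            "objects": [{"database": database}],
        }
    }
    return json.dumps(command, indent=2)
```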
After exporting, you can use OneLake file explorer, which integrates OneLake with Windows File Explorer, to locate the Delta table export files.
In OneLake file explorer, right-click the workspace folder, and then select Sync from OneLake.
Use Windows File Explorer to locate your data files. In the workspace folder, look for a subfolder with a name that matches your semantic model and ends with .SemanticModel. The semantic model folder contains a subfolder for each import-mode table, holding that table's Parquet files and Delta log.
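If you want to enumerate the exported tables from a script instead of browsing manually, the sketch below scans a synced workspace folder for .SemanticModel subfolders. It assumes the layout described above: one subfolder per table, each containing a _delta_log folder.

```python
from pathlib import Path


def list_exported_tables(workspace_dir: str) -> dict:
    """Map each *.SemanticModel folder under a synced workspace directory
    to the names of the Delta table folders it contains. A folder counts
    as a Delta table if it holds a _delta_log subfolder."""
    exported = {}
    for model_dir in Path(workspace_dir).glob("*.SemanticModel"):
        tables = sorted(
            log.parent.name for log in model_dir.rglob("_delta_log")
        )
        exported[model_dir.name] = tables
    return exported
```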
By creating shortcuts for your semantic model Lakehouse tables, you can provide quick and easy access to them from other workloads in Fabric.
In Lakehouse Explorer, right-click Tables, and then select New shortcut.
In New shortcut, select Microsoft OneLake.
In Select a data source type, select your semantic model, and then select the tables you want to include.
During preview, currency data type values with more than 18 decimal places can lose some precision when exported to Delta files.
During preview, semantic models in BYOK-enabled (bring your own key) workspaces aren't supported.
During preview, shortcut tables built on top of the exported model in Lakehouse can't be queried by using the SQL endpoint.
During preview, Multi-Geo capacities are not yet supported.
During preview, the export operation itself isn't billed, but the compute and storage that the exported model consumes on OneLake are billed.
For users with contributor permissions on exported model tables but only viewer permissions on the workspace, the model folder appears in Lakehouse explorer, but selecting it returns an error.
Measures, DirectQuery tables, hybrid tables, calculation group tables, and system managed aggregation tables can't be exported to Delta format tables.
Only a single version of the Delta tables is exported and stored on OneLake; older versions are deleted after a successful export. Other execution engines that use the older, now-deleted version of the data can experience transient failures.