Connecting to Microsoft OneLake
Microsoft OneLake provides open access to all of your Fabric items through existing Azure Data Lake Storage (ADLS) Gen2 APIs and SDKs. You can access your data in OneLake through any API, SDK, or tool compatible with ADLS Gen2 simply by using a OneLake URI instead. For example, you can upload data to a lakehouse through Azure Storage Explorer, or read a Delta table through a shortcut from Azure Databricks.
As OneLake is software as a service (SaaS), some operations, such as managing permissions or updating items, must be done through Fabric experiences instead of the ADLS Gen2 APIs. For a full list of changes to these APIs, see OneLake API parity.
URI syntax
Because OneLake exists across your entire Microsoft Fabric tenant, you can refer to anything in your tenant by its workspace, item, and path:
https://onelake.dfs.fabric.microsoft.com/<workspace>/<item>.<itemtype>/<path>/<fileName>
Note
Because you can reuse item names across multiple item types, you must specify the item type in the extension: for example, .lakehouse for a lakehouse and .datawarehouse for a warehouse.
OneLake also supports referencing workspaces and items with globally unique identifiers (GUIDs). OneLake assigns these GUIDs, and they don't change even if the workspace or item name changes. You can find the GUID for your workspace or item in the URL on the Fabric portal. You must use GUIDs for both the workspace and the item, and you don't need to specify the item type.
https://onelake.dfs.fabric.microsoft.com/<workspaceGUID>/<itemGUID>/<path>/<fileName>
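The two URI forms above can be sketched as small helpers. This is a minimal illustration; the helper names and the workspace, item, and path values are placeholders, not part of any SDK:

```python
def onelake_uri(workspace: str, item: str, itemtype: str, path: str) -> str:
    """Name-based OneLake URI; the item type suffix is required."""
    return f"https://onelake.dfs.fabric.microsoft.com/{workspace}/{item}.{itemtype}/{path}"

def onelake_uri_by_guid(workspace_guid: str, item_guid: str, path: str) -> str:
    """GUID-based OneLake URI; no item type suffix is needed."""
    return f"https://onelake.dfs.fabric.microsoft.com/{workspace_guid}/{item_guid}/{path}"

print(onelake_uri("myworkspace", "mylakehouse", "lakehouse", "Files/sales.csv"))
# https://onelake.dfs.fabric.microsoft.com/myworkspace/mylakehouse.lakehouse/Files/sales.csv
```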
When adopting a tool for use over OneLake instead of ADLS Gen2, use the following mapping:
- The account name is always onelake.
- The container name is your workspace name.
- The data path starts at the item. For example: /mylakehouse.lakehouse/Files/.
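The mapping above can be sketched as a plain data structure; the helper and the names passed to it are hypothetical, not part of any SDK:

```python
def to_onelake(workspace: str, item_path: str) -> dict:
    """Map the ADLS Gen2 account/container/path concepts onto OneLake:
    the account is always 'onelake', the workspace stands in for the
    container, and the data path starts at the item."""
    return {
        "account_name": "onelake",
        "account_url": "https://onelake.dfs.fabric.microsoft.com",
        "container": workspace,       # workspace name plays the container role
        "path": item_path,            # e.g. "mylakehouse.lakehouse/Files/"
    }

cfg = to_onelake("myworkspace", "mylakehouse.lakehouse/Files/")
```

With an ADLS Gen2-compatible SDK such as azure-storage-file-datalake (an assumption about your tooling), these values would feed the service client's account URL, the file system (container) name, and the file path, respectively.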
OneLake also supports the Azure Blob Filesystem (ABFS) driver for more compatibility with ADLS Gen2 and Azure Blob Storage. The ABFS driver uses its own scheme identifier, abfs, and a different URI format to address files and directories in ADLS Gen2 accounts. To use this URI format over OneLake, use the workspace in place of the filesystem and include the item and item type.
abfs[s]://<workspace>@onelake.dfs.fabric.microsoft.com/<item>.<itemtype>/<path>/<fileName>
The ABFS driver URI doesn't allow special characters, such as spaces, in the workspace name. In those cases, reference workspaces and items with GUIDs as described earlier in this section.
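A sketch of building an ABFS URI with the GUID fallback for workspace names the driver rejects. The helper and the set of characters it checks for are illustrative assumptions, not an exhaustive list of what the ABFS driver disallows:

```python
def onelake_abfs_uri(workspace: str, item: str, itemtype: str, path: str,
                     workspace_guid: str = None, item_guid: str = None) -> str:
    """Build an abfss:// URI for OneLake; fall back to GUID-based
    addressing when the workspace name contains special characters."""
    if any(c in workspace for c in " #%"):  # illustrative check only
        if not (workspace_guid and item_guid):
            raise ValueError("workspace name requires GUID-based addressing")
        return f"abfss://{workspace_guid}@onelake.dfs.fabric.microsoft.com/{item_guid}/{path}"
    return f"abfss://{workspace}@onelake.dfs.fabric.microsoft.com/{item}.{itemtype}/{path}"
```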
Authorization
You can authenticate to OneLake APIs using Microsoft Entra ID by passing an authorization header. If a tool supports signing in to your Azure account to enable token passthrough, you can select any subscription: OneLake only requires your user token and doesn't depend on your Azure subscription.
When calling OneLake via DFS APIs directly, you can authenticate with a bearer token for your Microsoft Entra account. To learn more about requesting and managing bearer tokens for your organization, check out the Microsoft Authentication Library.
For quick, ad-hoc testing of OneLake using direct API calls, here's a simple example using PowerShell to sign in to your Azure account, retrieve a storage-scoped token, and copy it to your clipboard for easy use elsewhere. For more information about retrieving access tokens using PowerShell, see Get-AzAccessToken.
Note
OneLake only supports tokens in the Storage audience. In the following example, we set the audience through the ResourceTypeName parameter.
Connect-AzAccount
$testToken = Get-AzAccessToken -ResourceTypeName Storage
# The retrieved token is a string, which you can verify with "$testToken.Token.GetTypeCode()".
$testToken.Token | Set-Clipboard
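Once you have a token, attaching it to a direct DFS call looks like the following. This is a minimal stdlib sketch; the workspace, item, and file names are placeholders, and the request is constructed but deliberately not sent here:

```python
import urllib.request

token = "<paste the token copied by Set-Clipboard here>"  # placeholder value
url = ("https://onelake.dfs.fabric.microsoft.com/"
       "myworkspace/mylakehouse.lakehouse/Files/sample.txt")

# Attach the bearer token in the Authorization header.
req = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})

# urllib.request.urlopen(req) would perform the call; omitted here so the
# sketch stays runnable without network access or a real token.
```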
Data residency
If you use the global endpoint (https://onelake.dfs.fabric.microsoft.com) to query data in a region different from your workspace's region, data could leave your region during the endpoint resolution process. If you're concerned about data residency, use the regional endpoint for your workspace to ensure your data stays within its region and doesn't cross any regional boundaries. You can find the correct regional endpoint by checking the region of the capacity that the workspace is attached to.
OneLake regional endpoints all follow the same format: https://<region>-onelake.dfs.fabric.microsoft.com. For example, a workspace attached to a capacity in the West US region is accessible through the regional endpoint https://westus-onelake.dfs.fabric.microsoft.com.
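The endpoint format above is mechanical enough to express as a one-line helper (the function name is illustrative):

```python
def regional_endpoint(region: str) -> str:
    """OneLake regional endpoint for a capacity region, e.g. 'westus'."""
    return f"https://{region}-onelake.dfs.fabric.microsoft.com"
```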
Common issues
If a tool or package compatible with ADLS Gen2 isn't working over OneLake, the most common issue is URL validation. Because OneLake uses a different endpoint (dfs.fabric.microsoft.com) than ADLS Gen2 (dfs.core.windows.net), some tools don't recognize the OneLake endpoint and block it. Some tools (such as PowerShell) let you specify custom endpoints. Otherwise, it's often a simple fix to add OneLake's endpoint as a supported endpoint. If you find a URL validation issue or have any other issues connecting to OneLake, let us know.
Samples
Create file
| | |
|---|---|
| Request | `PUT https://onelake.dfs.fabric.microsoft.com/{workspace}/{item}.{itemtype}/Files/sample?resource=file` |
| Headers | `Authorization: Bearer <userAADToken>` |
| Response | ResponseCode: `201 Created`<br>Headers:<br>`x-ms-version: 2021-06-08`<br>`x-ms-request-id: 272526c7-0995-4cc4-b04a-8ea3477bc67b`<br>`x-ms-content-crc64: OAJ6r0dQWP0=`<br>`x-ms-request-server-encrypted: true`<br>`ETag: 0x8DA58EE365`<br>Body: |
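The Create file request above can be constructed as follows. This is a stdlib sketch under the same placeholder names; the request is built but not sent, and writing actual content would additionally require the ADLS Gen2 append and flush operations:

```python
import urllib.request

def create_file_request(workspace: str, item: str, itemtype: str,
                        name: str, token: str) -> urllib.request.Request:
    """Build (but don't send) the Create file PUT request from the sample."""
    url = (f"https://onelake.dfs.fabric.microsoft.com/"
           f"{workspace}/{item}.{itemtype}/Files/{name}?resource=file")
    return urllib.request.Request(
        url,
        method="PUT",
        headers={"Authorization": f"Bearer {token}"},
    )

req = create_file_request("myworkspace", "mylakehouse", "lakehouse",
                          "sample", "<userAADToken>")
```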