Trusted workspace access
Fabric allows you to access firewall-enabled Azure Data Lake Storage (ADLS) Gen2 accounts in a secure manner. Fabric workspaces that have a workspace identity can securely access ADLS Gen2 accounts with public network access enabled from selected virtual networks and IP addresses. You can limit ADLS Gen2 access to specific Fabric workspaces.
Fabric workspaces that access a storage account with trusted workspace access need proper authorization for the request. Authorization is supported with Microsoft Entra credentials for organizational accounts or service principals. To find out more about resource instance rules, see Grant access from Azure resource instances.
To limit and protect access to firewall-enabled storage accounts from certain Fabric workspaces, you can set up a resource instance rule that allows access only from specific Fabric workspaces.
Note
Trusted workspace access is generally available, but can only be used in F SKU capacities. For information about buying a Fabric subscription, see Buy a Microsoft Fabric subscription. Trusted workspace access is not supported in Trial capacities.
This article shows you how to:
- Configure trusted workspace access in an ADLS Gen2 storage account.
- Create a OneLake shortcut in a Fabric Lakehouse that connects to a trusted-workspace-access enabled ADLS Gen2 storage account.
- Create a data pipeline to connect directly to a firewall-enabled ADLS Gen2 account that has trusted workspace access enabled.
- Use the T-SQL COPY statement to ingest data into your Warehouse from a firewall-enabled ADLS Gen2 account that has trusted workspace access enabled.
You can configure specific Fabric workspaces to access your storage account based on their workspace identity. To do so, create a resource instance rule by deploying an ARM template that contains the rule. To create a resource instance rule:
Sign in to the Azure portal and go to Custom deployment.
Choose Build your own template in the editor. For a sample ARM template that creates a resource instance rule, see ARM template sample.
Create the resource instance rule in the editor. When done, choose Review + Create.
On the Basics tab that appears, specify the required project and instance details. When done, choose Review + Create.
On the Review + Create tab that appears, review the summary and then select Create. The rule will be submitted for deployment.
When deployment is complete, you'll be able to go to the resource.
Note
- Resource instance rules for Fabric workspaces can only be created through ARM templates. Creation through the Azure portal is not supported.
- The subscriptionId "00000000-0000-0000-0000-000000000000" must be used for the Fabric workspace resourceId.
- You can get the workspace id for a Fabric workspace through its address bar URL.
Here's an example of a resource instance rule that can be created through an ARM template. For a complete example, see ARM template sample.
"resourceAccessRules": [
{ "tenantId": " aaaabbbb-0000-cccc-1111-dddd2222eeee",
"resourceId": "/subscriptions/00000000-0000-0000-0000-000000000000/resourcegroups/Fabric/providers/Microsoft.Fabric/workspaces/aaaa0a0a-bb1b-cc2c-dd3d-eeeeee4e4e4e"
}
]
If you select the trusted service exception for an ADLS Gen2 account that has public network access enabled from selected virtual networks and IP addresses, Fabric workspaces with a workspace identity will be able to access the storage account. When the trusted service exception checkbox is selected, any workspaces in your tenant's Fabric capacities that have a workspace identity can access data stored in the storage account.
This configuration isn't recommended, and support might be discontinued in the future. We recommend that you use resource instance rules to grant access to specific resources.
A user with the Contributor role (an Azure RBAC role) on the storage account can configure resource instance rules or the trusted service exception.
There are currently three ways to access your data securely from Fabric with trusted workspace access:
- You can create a new ADLS shortcut in a Fabric Lakehouse to start analyzing your data with Spark, SQL, and Power BI.
- You can create a data pipeline that leverages trusted workspace access to directly access a firewall-enabled ADLS Gen2 account.
- You can use a T-SQL COPY statement that leverages trusted workspace access to ingest data into a Fabric warehouse.
The following sections show you how to use these methods.
With the workspace identity configured in Fabric, and trusted workspace access enabled in your ADLS Gen2 storage account, you can create OneLake shortcuts to access your data from Fabric. You just create a new ADLS shortcut in a Fabric Lakehouse and you can start analyzing your data with Spark, SQL, and Power BI.
The following prerequisites are required to create the shortcut:
- A Fabric workspace associated with a Fabric capacity. See Workspace identity.
- A workspace identity associated with the Fabric workspace.
- The user account or service principal used as the authentication kind in the shortcut must have Azure RBAC roles on the storage account: either a Storage Blob Data Contributor, Storage Blob Data Owner, or Storage Blob Data Reader role at the storage account scope, or a Storage Blob Delegator role at the storage account scope together with a Storage Blob Data Reader role at the container scope.
- A resource instance rule configured for the storage account.
Note
- Preexisting shortcuts in a workspace that meets the prerequisites will automatically start to support trusted service access.
- You must use the DFS URL for the storage account. Here's an example:
https://StorageAccountName.dfs.core.windows.net
Start by creating a new shortcut in a Lakehouse.
The New shortcut wizard opens.
Under External sources, select Azure Data Lake Storage Gen2.
Provide the URL of the storage account that has been configured with trusted workspace access, and choose a name for the connection. For Authentication kind, choose Organizational account or Service Principal.
When done, select Next.
Provide the shortcut name and sub path.
When done, select Create.
The lakehouse shortcut is created, and you should be able to preview storage data in the shortcut.
With OneCopy in Fabric, you can access your OneLake shortcuts with trusted access from all Fabric workloads.
Spark: You can use Spark to access data from your OneLake shortcuts. When shortcuts are used in Spark, they appear as folders in OneLake. You just need to reference the folder name to access the data. You can use OneLake shortcuts to storage accounts with trusted workspace access in Spark notebooks.
SQL analytics endpoint: Shortcuts created in the "Tables" section of your lakehouse are also available in the SQL analytics endpoint. You can open the SQL analytics endpoint and query your data just like any other table, as shown in the example query below.
Pipelines: Data pipelines can access managed shortcuts to storage accounts with trusted workspace access. Data pipelines can be used to read from or write to storage accounts through OneLake shortcuts.
Dataflows Gen2: You can use Dataflows Gen2 to access managed shortcuts to storage accounts with trusted workspace access. Dataflows Gen2 can read from or write to storage accounts through OneLake shortcuts.
Semantic models and reports: The default semantic model associated with the SQL analytics endpoint of a Lakehouse can read managed shortcuts to storage accounts with trusted workspace access. To see the managed tables in the default semantic model, go to the SQL analytics endpoint item, select Reporting, and choose Automatically update semantic model.
You can also create new semantic models that reference table shortcuts to storage accounts with trusted workspace access. Go to the SQL analytics endpoint, select Reporting and choose New semantic model.
You can create reports on top of the default semantic models and custom semantic models.
KQL Database: You can also create OneLake shortcuts to ADLS Gen2 in a KQL database. The steps to create the managed shortcut with trusted workspace access remain the same.
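As an example of querying a shortcut through the SQL analytics endpoint, the following T-SQL is a minimal sketch; the schema and table name (dbo.SalesShortcut) are illustrative placeholders for a table shortcut in your own lakehouse.
-- Query a table shortcut through the SQL analytics endpoint.
-- Replace dbo.SalesShortcut with the name of your table shortcut.
SELECT TOP (100) *
FROM [dbo].[SalesShortcut];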
With the workspace identity configured in Fabric and trusted access enabled in your ADLS Gen2 storage account, you can create data pipelines to access your data from Fabric. You can create a new data pipeline to copy data into a Fabric lakehouse and then you can start analyzing your data with Spark, SQL, and Power BI.
The following prerequisites are required to create the data pipeline:
- A Fabric workspace associated with a Fabric capacity. See Workspace identity.
- A workspace identity associated with the Fabric workspace.
- The user account or service principal used to create the connection must have a Storage Blob Data Contributor, Storage Blob Data Owner, or Storage Blob Data Reader role at the storage account scope.
- A resource instance rule configured for the storage account.
Start by selecting Get Data in a lakehouse.
Select New data pipeline. Provide a name for the pipeline and then select Create.
Choose Azure Data Lake Storage Gen2 as the data source.
Provide the URL of the storage account that has been configured with trusted workspace access, and choose a name for the connection. For Authentication kind, choose Organizational account or Service Principal.
When done, select Next.
Select the file that you need to copy into the lakehouse.
When done, select Next.
On the Review + save screen, select Start data transfer immediately. When done, select Save + Run.
When the pipeline status changes from Queued to Succeeded, go to the lakehouse and verify that the data tables were created.
With the workspace identity configured in Fabric and trusted access enabled in your ADLS Gen2 storage account, you can use the COPY T-SQL statement to ingest data into your Fabric warehouse. Once the data is ingested into the warehouse, you can start analyzing it with SQL and Power BI.
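Here's a minimal sketch of such a COPY statement. The warehouse table, storage account, container, and file path are illustrative placeholders, and the sketch assumes the statement runs under an organizational account or service principal that has the required roles on the storage account (Microsoft Entra authentication, with no CREDENTIAL clause specified).
-- Ingest a Parquet file from a firewall-enabled ADLS Gen2 account into a warehouse table.
-- StorageAccountName, salesdata, and dbo.SalesData are illustrative placeholders.
COPY INTO [dbo].[SalesData]
FROM 'https://StorageAccountName.dfs.core.windows.net/salesdata/sales.parquet'
WITH (
    FILE_TYPE = 'PARQUET'
);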
Restrictions and considerations
- Trusted workspace access is supported for workspaces in any Fabric F SKU capacity.
- You can only use trusted workspace access in OneLake shortcuts, data pipelines, and the T-SQL COPY statement. To securely access storage accounts from Fabric Spark, see Managed private endpoints for Fabric.
- If a workspace with a workspace identity is migrated to a non-Fabric capacity, or to a non-F SKU Fabric capacity, trusted workspace access will stop working after an hour.
- Pre-existing shortcuts created before October 10, 2023 don't support trusted workspace access.
- Connections for trusted workspace access can't be created or modified in Manage connections and gateways.
- Connections to firewall-enabled Storage accounts will have the status Offline in Manage connections and gateways.
- If you reuse connections that support trusted workspace access in Fabric items other than shortcuts and pipelines, or in other workspaces, they might not work.
- Only an organizational account or a service principal can be used for authentication to storage accounts for trusted workspace access.
- Pipelines can't write to OneLake table shortcuts on storage accounts with trusted workspace access. This is a temporary limitation.
- A maximum of 200 resource instance rules can be configured. For more information, see Azure subscription limits and quotas - Azure Resource Manager.
- Trusted workspace access only works when public access is enabled from selected virtual networks and IP addresses.
- Resource instance rules for Fabric workspaces must be created through ARM templates. Resource instance rules created through the Azure portal UI aren't supported.
- Pre-existing shortcuts in a workspace that meets the prerequisites will automatically start to support trusted service access.
- If your organization has a Microsoft Entra Conditional Access policy for workload identities that includes all service principals, trusted workspace access won't work. In such cases, you need to exclude specific Fabric workspace identities from the Conditional Access policy for workload identities.
- Trusted workspace access isn't supported if a service principal is used to create the shortcut.
- Trusted workspace access isn't compatible with cross-tenant requests.
Known issue
If a shortcut in a lakehouse that targets a firewall-protected ADLS Gen2 storage account becomes inaccessible, it might be because the lakehouse was shared with a user who doesn't have an admin, member, or contributor role in the workspace where the lakehouse resides. This is a known issue. As a workaround, don't share the lakehouse with users who don't have an admin, member, or contributor role in the workspace.
ARM template sample
{
    "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "resources": [
        {
            "type": "Microsoft.Storage/storageAccounts",
            "apiVersion": "2023-01-01",
            "name": "<storage account name>",
            "id": "/subscriptions/<subscription id of storage account>/resourceGroups/<resource group name>/providers/Microsoft.Storage/storageAccounts/<storage account name>",
            "location": "<region>",
            "kind": "StorageV2",
            "properties": {
                "networkAcls": {
                    "resourceAccessRules": [
                        {
                            "tenantId": "<tenantid>",
                            "resourceId": "/subscriptions/00000000-0000-0000-0000-000000000000/resourcegroups/Fabric/providers/Microsoft.Fabric/workspaces/<workspace-id>"
                        }
                    ]
                }
            }
        }
    ]
}