For some reason I couldn't comment on your post. I created a new app registration, generated a client secret, and assigned the app the "Contributor" role on the storage account. I then tried mounting with OAuth 2.0:
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "xxx...",
    "fs.azure.account.oauth2.client.secret": "xxx...",
    "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/xxx.../oauth2/token",
    "fs.azure.createRemoteFileSystemDuringInitialization": "true",
}

dbutils.fs.mount(
    source="abfss://demo@mystorage.dfs.core.windows.net/",
    mount_point="/mnt/demo",
    extra_configs=configs,
)
This still throws an exception:
ExecutionError: An error occurred while calling o298.mount.
: Operation failed: "This request is not authorized to perform this operation.", 403, PUT, https://mystorage.dfs.core.windows.net/demo?resource=filesystem, AuthorizationFailure, "This request is not authorized to perform this operation. RequestId:4838ca85-d01f-0025-55d2-194358000000 Time:2022-02-04T14:20:09.0319452Z"
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.services.AbfsRestOperation.execute(AbfsRestOperation.java:241)
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.services.AbfsClient.createFilesystem(AbfsClient.java:186)
As for the other method, the SAS documentation page is missing any detail on how to set up SAS auth from Python.
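From what I can piece together, the Python side would be something like the sketch below. This is untested on my end: the per-account config keys and the FixedSASTokenProvider class come from the Hadoop ABFS driver, and the account name and token here are placeholders.

```python
def sas_configs(account: str, sas_token: str) -> dict:
    """Build the Spark conf entries for fixed-SAS auth against one storage account."""
    suffix = f"{account}.dfs.core.windows.net"
    return {
        # Tell the ABFS driver to use SAS auth for this account
        f"fs.azure.account.auth.type.{suffix}": "SAS",
        # Provider that returns a fixed, pre-generated SAS token
        f"fs.azure.sas.token.provider.type.{suffix}":
            "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider",
        # The SAS token itself (placeholder)
        f"fs.azure.sas.fixed.token.{suffix}": sas_token,
    }

# In a notebook you would then apply each entry, e.g.:
# for key, value in sas_configs("mystorage", "<sas-token>").items():
#     spark.conf.set(key, value)
```

With that in place, the container should be reachable directly via the abfss:// URI rather than through a mount, assuming the SAS token itself grants the needed permissions.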