@Adini, Ashish Aravind - Thanks for the question and for using the MS Q&A platform.
The error message you are seeing indicates that the value set for `fs.azure.sas.token.provider.type` is invalid. This configuration specifies the SAS token provider class that Azure Databricks uses to authenticate to Azure Storage.
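Before re-configuring anything, you can read back the value your session currently holds for this setting to confirm what the cluster is actually using. A minimal sketch (replace `<storage-account>` with your account name; `spark` is predefined in Databricks notebooks):

```python
# Read back the SAS token provider currently configured for this storage
# account; returns None if the setting has not been applied in this session.
current = spark.conf.get(
    "fs.azure.sas.token.provider.type.<storage-account>.dfs.core.windows.net",
    None,
)
print(current)
```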
Here are some steps you can take to resolve the issue:
Step1: Create a SAS token for the storage account (for example, from the Azure portal) and copy it. Store the token in an Azure Databricks secret scope so that the configuration in Step2 can retrieve it with dbutils.secrets.get.
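If you prefer to generate the SAS token programmatically rather than from the Azure portal, here is a minimal sketch using the azure-storage-blob Python SDK; the account, container, and key values below are placeholders, and you should adjust the permissions and expiry to your needs:

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import ContainerSasPermissions, generate_container_sas

# Placeholders: substitute your own storage account, container, and account key.
sas_token = generate_container_sas(
    account_name="<storage-account>",
    container_name="<container-name>",
    account_key="<storage-account-access-key>",
    permission=ContainerSasPermissions(read=True, list=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=8),
)
```

You can then store the resulting token in a Databricks secret scope (for example, with the Databricks CLI secrets commands) so that Step2 can retrieve it with dbutils.secrets.get.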
Step2: Configure the Spark session with the SAS token. You can configure SAS tokens for multiple storage accounts in the same Spark session.
Replace:

- `<storage-account>` with the Azure Storage account name.
- `<scope>` with the Azure Databricks secret scope name.
- `<sas-token-key>` with the name of the secret key containing the SAS token.
spark.conf.set("fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net", "SAS")
spark.conf.set("fs.azure.sas.token.provider.type.<storage-account>.dfs.core.windows.net", "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider")
spark.conf.set("fs.azure.sas.fixed.token.<storage-account>.dfs.core.windows.net", dbutils.secrets.get(scope="<scope>", key="<sas-token-key>"))
Step3: Access the Azure storage account. Once you have properly configured credentials to access your Azure storage container, you can interact with resources in the storage account using URIs. Databricks recommends using the `abfss` driver for greater security.
```python
dbutils.fs.ls("abfss://<container-name>@<storage-account-name>.dfs.core.windows.net/<path-to-data>")
```
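Beyond listing files, you can read data through the same URI. A minimal sketch, assuming a hypothetical CSV file under the container configured above:

```python
# Hypothetical example: read a CSV file from the configured container.
df = (
    spark.read
    .option("header", "true")
    .csv("abfss://<container-name>@<storage-account-name>.dfs.core.windows.net/data/sample.csv")
)
df.show(5)
```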
Per my repro, I was able to access the storage account using the SAS token with the above configuration.
For more details, refer to Connect to Azure Data Lake Storage Gen2 and Blob Storage - SAS Token.
Hope this helps. Do let us know if you have any further queries.

If this answers your query, do click **Accept Answer** and **Yes** for "Was this answer helpful?".