Unable to use SAS token for my Azure Storage on Azure Databricks

Adini, Ashish Aravind 0 Reputation points
2023-04-27T22:04:26.11+00:00

Unable to load SAS token provider class: java.lang.IllegalArgumentException: The configuration value for "fs.azure.sas.token.provider.type" is invalid.

Azure Databricks

1 answer

  1. PRADEEPCHEEKATLA-MSFT 90,146 Reputation points Microsoft Employee
    2023-05-01T06:48:01.8633333+00:00

    @Adini, Ashish Aravind - Thanks for the question and using MS Q&A platform.

    The error message indicates that the value set for "fs.azure.sas.token.provider.type" is invalid. This configuration property tells Azure Databricks which SAS token provider class to use when authenticating to Azure Storage, and its value must be the fully qualified name of a SAS token provider class, so a typo in the class name is a common cause of this error.
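
    As a quick sanity check, you can read back the value currently set in your Spark session. A minimal sketch, where "<storage-account>" is a placeholder for your account name:

    # Read back the provider class configured for this account in the
    # running Spark session; "<storage-account>" is a placeholder.
    key = "fs.azure.sas.token.provider.type.<storage-account>.dfs.core.windows.net"
    expected = "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider"
    current = spark.conf.get(key, "not set")
    if current != expected:
        print(f"Configured provider is '{current}'; expected '{expected}'")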

    Here are some steps you can take to resolve the issue:

    Step 1: Create a SAS token for the storage account and copy it.

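    If you prefer to generate the SAS token programmatically instead of through the portal, a sketch using the azure-storage-blob SDK might look like this. The account name, key, permissions, and expiry below are placeholders and examples, not prescriptions:

    from datetime import datetime, timedelta
    from azure.storage.blob import (
        generate_account_sas, ResourceTypes, AccountSasPermissions
    )

    # Placeholder credentials; never hard-code a real account key in a notebook.
    sas_token = generate_account_sas(
        account_name="<storage-account>",
        account_key="<storage-account-key>",
        resource_types=ResourceTypes(service=True, container=True, object=True),
        permission=AccountSasPermissions(read=True, list=True),
        expiry=datetime.utcnow() + timedelta(hours=2),
    )
    # Store the resulting token in a Databricks secret scope so that Step 2
    # can reference it with dbutils.secrets.get.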

    Step 2: Configure the SAS token in the Spark session. You can configure SAS tokens for multiple storage accounts in the same Spark session.

    Replace the following placeholders in the configuration below:

    • <storage-account> with the Azure Storage account name.
    • <scope> with the Azure Databricks secret scope name.
    • <sas-token-key> with the name of the secret containing the SAS token.

    # Authenticate to the storage account with a SAS token
    spark.conf.set("fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net", "SAS")
    # Use the fixed (static) SAS token provider built into the ABFS driver
    spark.conf.set("fs.azure.sas.token.provider.type.<storage-account>.dfs.core.windows.net", "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider")
    # Read the SAS token itself from a Databricks secret scope
    spark.conf.set("fs.azure.sas.fixed.token.<storage-account>.dfs.core.windows.net", dbutils.secrets.get(scope="<scope>", key="<sas-token-key>"))
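
    The same three settings can be repeated for each additional account. As an illustrative sketch (the account names and secret key names below are made up; substitute your own):

    # Hypothetical account names mapped to hypothetical secret key names.
    accounts = {"acctsales": "sales-sas-token",
                "acctfinance": "finance-sas-token"}
    for account, secret_key in accounts.items():
        host = f"{account}.dfs.core.windows.net"
        spark.conf.set(f"fs.azure.account.auth.type.{host}", "SAS")
        spark.conf.set(f"fs.azure.sas.token.provider.type.{host}",
                       "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider")
        spark.conf.set(f"fs.azure.sas.fixed.token.{host}",
                       dbutils.secrets.get(scope="<scope>", key=secret_key))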
    

    Step 3: Access the Azure storage account. Once the credentials are configured, you can interact with resources in the storage account using URIs. Databricks recommends the abfss driver for greater security.

    # List the container contents to verify that the SAS token works
    dbutils.fs.ls("abfss://<container-name>@<storage-account-name>.dfs.core.windows.net/<path-to-data>")
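
    Once the listing succeeds, the same URI works with Spark readers. For instance, a sketch assuming CSV data with a header row (format, options, and path are illustrative):

    # Illustrative read; adjust container, account, path, and format to your data.
    df = (spark.read
          .format("csv")
          .option("header", "true")
          .load("abfss://<container-name>@<storage-account-name>.dfs.core.windows.net/<path-to-data>"))
    display(df)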
    

    Following these steps, I was able to reproduce the setup and access the storage account using the SAS token.


    For more details, refer to Connect to Azure Data Lake Storage Gen2 and Blob Storage - SAS Token.

    Hope this helps. Do let us know if you have any further queries.


    If this answers your query, please click Accept Answer and mark Yes for "Was this answer helpful".

