Spark config through Databricks

Mxxx 20 Reputation points
2024-11-20T06:56:00.0466667+00:00

Hi, I am trying to access a storage account through Databricks (via a notebook) with a storage access key.

I keep encountering an error when I call spark.conf.set:

spark.conf.set(
    f"fs.azure.account.key.{storage_account_name}.blob.core.windows.net",
    access_key,
)

I have tried many alternative snippets with spark.conf.set, but I keep encountering the same error message:

Configuration fs.azure.account.key.{storage_account_name}.blob.core.windows.net is not available. SQLSTATE: 42K0I

May I know what could be the issue?
I am using serverless compute.

Thank you.

Azure Databricks
An Apache Spark-based analytics platform optimized for Azure.

Accepted answer
  1. Ganesh Gurram 1,825 Reputation points Microsoft Vendor
    2024-11-25T10:29:47.3333333+00:00

    @Mxxx - I'm glad that you were able to resolve your issue, and thank you for posting your solution so that others experiencing the same thing can easily reference it! Since the Microsoft Q&A community has a policy that "the question author cannot accept their own answer; they can only accept answers by others," I'll repost your solution in case you'd like to accept the answer.

    Ask: Spark config through Databricks

    Solution: It seems this is due to using serverless compute, which restricts which Spark configurations can be set at runtime. The same code runs successfully on personal compute.
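
    For reference, a minimal sketch of the same pattern on personal (classic) compute, where setting the account key is allowed. The storage account, secret scope, container, and file path below are hypothetical placeholders:

    # Minimal sketch, assuming classic (non-serverless) compute.
    # <storage-account>, <scope>, <key-name>, and <container> are
    # hypothetical placeholders -- substitute your own values.
    storage_account_name = "<storage-account>"
    access_key = dbutils.secrets.get(scope="<scope>", key="<key-name>")

    spark.conf.set(
        f"fs.azure.account.key.{storage_account_name}.blob.core.windows.net",
        access_key,
    )

    # Read a file to confirm access (wasbs is the legacy Blob driver).
    df = spark.read.csv(
        f"wasbs://<container>@{storage_account_name}.blob.core.windows.net/data.csv",
        header=True,
    )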

    If I missed anything, please let me know and I'd be happy to add it to my answer, or feel free to comment below with any additional information.

    If you have any other questions, please let me know. Thank you again for your time and patience throughout this issue.

    Please don’t forget to Accept Answer and select Yes for "was this answer helpful" wherever the information provided helps you; this can be beneficial to other community members.


1 additional answer

  1. Deepanshukatara-6769 11,545 Reputation points
    2024-11-20T07:01:10.53+00:00

    Hello, welcome to MS Q&A!

    To access a storage account through Databricks using a storage access key, you can set Spark properties in your Databricks notebook or cluster configuration. Here’s how to do it:

    spark.conf.set(
        "fs.azure.account.key.<storage-account>.dfs.core.windows.net",
        dbutils.secrets.get(scope="<scope>", key="<storage-account-access-key>")
    )

    Once configured, you can interact with resources in the storage account using abfss:// URIs. The abfss driver is recommended for enhanced security.
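
    For example, a minimal sketch of reading data once the key is configured; the container, account, and path here are hypothetical placeholders:

    # Hypothetical container, account, and path -- replace with your own.
    df = spark.read.parquet(
        "abfss://<container>@<storage-account>.dfs.core.windows.net/path/to/data"
    )
    display(df)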

    Kindly accept the answer if it works. Please let us know if you have any questions.

    Thanks,
    Deepanshu

