How to connect to ADLS using spark-shell (not through databricks)

Bhaskar 0 Reputation points

I have downloaded Apache Spark and am using spark-shell (bundled with the Apache Spark binaries) to set up a Spark session and read/write to ADLS. I have a storage account set up with a client secret, and a single file sitting in blob storage that I would like to read using the abfss protocol. So, per the documentation, I have set the Spark configuration as follows, where I have the values for storageAccountName, clientId, clientSecret, and tenantId in variables:

spark.conf.set("fs.azure.account.auth.type." + storageAccountName + ".dfs.core.windows.net", "OAuth")
spark.conf.set("fs.azure.account.oauth2.client.id." + storageAccountName + ".dfs.core.windows.net", clientId)
spark.conf.set("fs.azure.account.oauth2.client.secret." + storageAccountName + ".dfs.core.windows.net", clientSecret)
spark.conf.set("fs.azure.account.oauth2.client.endpoint." + storageAccountName + ".dfs.core.windows.net", "https://login.microsoftonline.com/" + tenantId + "/oauth2/token")

When I try to read my file from storage:

val df = spark.read.load("abfss://<<mycontainer>>@<<storageAccountName>>")

I get the following error:

WARN FileStreamSink: Assume no metadata directory. Error while looking for metadata directory in the path: abfss://<<mycontainer>>@<<storageAccountName>>
Invalid configuration value detected for

Did anyone run into this type of problem? I am guessing that, Databricks or not, a Spark session should be the same. I am not sure what I am missing and would appreciate any insights into this problem and how to get around it. Thanks in advance.

Azure Data Lake Storage
Azure Blob Storage

1 answer

  1. PRADEEPCHEEKATLA-MSFT 73,886 Reputation points Microsoft Employee

    @Bhaskar - Thanks for the question and using MS Q&A platform.

    It seems you are trying to read a file from ADLS using spark-shell. Based on the error message you provided, there appears to be an issue with the configuration values you have set for the storage account.

    Here are a few things you can check to resolve the issue:

    Make sure that the values you have set for storageAccountName, clientId, clientSecret, and tenantId are correct and correspond to your ADLS account.

    Check that you have set the complete configuration keys for the storage account, including the OAuth provider type. You can set them using the following code:

    spark.conf.set("fs.azure.account.auth.type.<your-storage-account-name>.dfs.core.windows.net", "OAuth")
    spark.conf.set("fs.azure.account.oauth.provider.type.<your-storage-account-name>.dfs.core.windows.net", "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
    spark.conf.set("fs.azure.account.oauth2.client.id.<your-storage-account-name>.dfs.core.windows.net", "<your-client-id>")
    spark.conf.set("fs.azure.account.oauth2.client.secret.<your-storage-account-name>.dfs.core.windows.net", "<your-client-secret>")
    spark.conf.set("fs.azure.account.oauth2.client.endpoint.<your-storage-account-name>.dfs.core.windows.net", "https://login.microsoftonline.com/<your-tenant-id>/oauth2/token")
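
    With the configuration in place, a read from spark-shell could look like the sketch below. All names (storage account, container, file path) are placeholder assumptions. This also assumes the ABFS driver (the hadoop-azure package) is on the spark-shell classpath, which it is not by default in a plain Apache Spark download; for example, start spark-shell with a --packages org.apache.hadoop:hadoop-azure:<version> flag, matching the version to your Spark build's Hadoop version.

    ```scala
    // Sketch only -- all names below are placeholder assumptions.
    // Run inside spark-shell, where `spark` is the pre-created SparkSession.
    val storageAccountName = "mystorageaccount"
    val container = "mycontainer"

    // Note the full authority (.dfs.core.windows.net) and a file path after it:
    val path = s"abfss://$container@$storageAccountName.dfs.core.windows.net/input/data.csv"

    val df = spark.read
      .option("header", "true")
      .csv(path)

    df.show(5)
    ```

    One likely cause of the error in the question is that the URI is missing the .dfs.core.windows.net suffix and the file path after the container, so the driver cannot match the account host name against the account-scoped configuration keys.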

    Make sure that the service principal has the correct permissions to access the file in ADLS. You can check this in the Azure portal under the storage account's Access control (IAM) settings; for OAuth access via abfss, the principal typically needs a data-plane role such as Storage Blob Data Reader or Storage Blob Data Contributor.
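
    As an alternative to the portal, the Azure CLI can list the role assignments held by the service principal. The IDs and scope below are placeholder assumptions:

    ```shell
    # Sketch with placeholder IDs -- check that the service principal holds a
    # data-plane role (e.g. "Storage Blob Data Reader") on the storage account.
    az role assignment list \
      --assignee "<client-id>" \
      --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account-name>" \
      --output table
    ```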

    For more details, refer to the Azure Storage documentation on accessing ADLS Gen2 with OAuth.

    If you have checked all of the above and still face the issue, please let me know and I can help you further.

    Hope this helps. Do let us know if you have any further queries.

    If this answers your query, do click "Accept Answer" and "Yes" for "Was this answer helpful". And, if you have any further query, do let us know.