Error while using DBUtils command in Databricks notebook to show a list of files from a Data Lake Storage account

Davendra Kumar 5 Reputation points
2025-07-13T17:02:54.59+00:00

Hi All,

I am getting an error while using a dbutils command from Databricks to list files from a Data Lake Storage account. The error screenshot is below; it says "invalid character in host name". What I have tried so far:

  • Checked all permissions on the storage account after assigning the Contributor role on blob storage, but no help.
  • Created a new storage account with a 24-character name as well, but no help. Container name: source; storage account name: mymasterdbstorageaccount.

Error screenshot below:

[screenshot: dbutils.fs.ls fails with "invalid character in host name"]

Please help me identify where the gap is and what is missing.

Azure Databricks
An Apache Spark-based analytics platform optimized for Azure.

2 answers

  1. Smaran Thoomu 32,530 Reputation points Microsoft External Staff Moderator
    2025-07-16T15:13:31.9633333+00:00

    Hi Davendra Kumar,
    Thank you for the detailed follow-up and for sharing your analysis - that’s incredibly helpful!

    You're absolutely right that mixing account key and service principal (OAuth) authentication methods in the same Databricks notebook session can cause conflicts. Running them in separate notebooks or clusters is indeed the correct approach and aligns with Databricks best practices.

    Also, your observation is valid - dbutils.fs.mount() does not support ABFS/ABFSS URIs for ADLS Gen2 when used with OAuth-based (service principal) authentication. In such cases, it's recommended to access the data directly using Spark configs or consider Unity Catalog-based access where available.
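
    For reference, "accessing the data directly using Spark configs" with a service principal typically means setting the OAuth account properties before calling dbutils.fs.ls. A minimal sketch, assuming placeholder values for <application-id>, <client-secret>, and <tenant-id> from your own app registration (this only runs inside a Databricks notebook, where spark and dbutils are predefined):

    ```python
    # Sketch: direct ABFSS access to ADLS Gen2 via a service principal (OAuth).
    # <application-id>, <client-secret>, and <tenant-id> are placeholders you
    # must fill in from your Entra ID app registration.
    storage_account = "mymasterdbstorageaccount"

    spark.conf.set(f"fs.azure.account.auth.type.{storage_account}.dfs.core.windows.net", "OAuth")
    spark.conf.set(f"fs.azure.account.oauth.provider.type.{storage_account}.dfs.core.windows.net",
                   "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
    spark.conf.set(f"fs.azure.account.oauth2.client.id.{storage_account}.dfs.core.windows.net",
                   "<application-id>")
    spark.conf.set(f"fs.azure.account.oauth2.client.secret.{storage_account}.dfs.core.windows.net",
                   "<client-secret>")
    spark.conf.set(f"fs.azure.account.oauth2.client.endpoint.{storage_account}.dfs.core.windows.net",
                   "https://login.microsoftonline.com/<tenant-id>/oauth2/token")

    # With the configs set, list the container directly — no mount needed.
    display(dbutils.fs.ls(f"abfss://source@{storage_account}.dfs.core.windows.net/"))
    ```

    Keeping this in its own notebook (or cluster), separate from any account-key configuration, avoids the authentication conflicts described above.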

    We really appreciate you taking the time to share your resolution. This kind of input is valuable for others facing similar issues.

    Request for Resurvey

    We acknowledge your previous feedback regarding your dissatisfaction with the response rating. If the recent guidance has clarified the scenario, we would greatly appreciate it if you could kindly reconsider your rating and mark the answer as Accepted on the thread to help others with similar questions.

    Revisiting the survey and updating your feedback helps us continue improving the support experience for the community.


    Please feel free to reach out if you have further questions or need help adapting your setup. Always happy to assist!

    1 person found this answer helpful.

  2. Vinodh247 40,066 Reputation points MVP Volunteer Moderator
    2025-07-14T04:23:07.2133333+00:00

    I suspect that the path you passed to dbutils.fs.ls() is malformed. Specifically, the abfss:// URI seems to be incorrectly constructed, most likely due to a copy/paste issue.

    dbutils.fs.ls("abfss://******@mymasterdbstorageaccount.dfs.core.windows.net/")

    Can you try the above (copy and paste it via Notepad to strip any hidden characters)?

    Additionally, if you are using ABFS/ABFSS directly, make sure the Spark config has the correct credentials set:

    spark.conf.set("fs.azure.account.key.mymasterdbstorageaccount.dfs.core.windows.net", "<your-access-key>")

    I believe you have mounted it already; if not, please check with:

    dbutils.fs.mount(
        source = "abfss://******@mymasterdbstorageaccount.dfs.core.windows.net/",
        mount_point = "/mnt/source",
        extra_configs = {"fs.azure.account.key.mymasterdbstorageaccount.dfs.core.windows.net": "<your-access-key>"}
    )
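
    Once the mount succeeds, the files should be listable through the mount point rather than the abfss URI. A quick check, assuming the /mnt/source mount point from the snippet above (dbutils is only available inside a Databricks notebook):

    ```python
    # List the container contents through the DBFS mount point
    display(dbutils.fs.ls("/mnt/source"))
    ```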

