Py4JError: ('Method public com.databricks.backend.daemon.dbutils.DBUtilsCore$Result com.databricks.backend.daemon.dbutils.DBUtilsCore.mount() is not whitelisted on class class com.databricks.backend.daemon.dbutils.DBUtilsCore',)

Gokul Bondre 0 Reputation points
2025-12-03T14:32:08.45+00:00

While mounting a storage location, I am getting the below error:

Py4JError: ('Method public com.databricks.backend.daemon.dbutils.DBUtilsCore$Result com.databricks.backend.daemon.dbutils.DBUtilsCore.mount() is not whitelisted on class class com.databricks.backend.daemon.dbutils.DBUtilsCore',)

Below is the cluster configuration:

[Screenshot: cluster configuration]

I have selected Manual, with the Dedicated (single user) access mode, and Unity Catalog. [Screenshot: cluster access mode settings]

The script is:

dbutils.fs.mount(
  source = f"abfss://{container_name}@{storage_account_name}.dfs.core.windows.net/",
  mount_point = f"/mnt/{mount_point}",
  extra_configs = configs)
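(The configs variable is referenced above but not shown. For reference, a typical service-principal OAuth configuration for dbutils.fs.mount looks roughly like the sketch below; the secret scope and key names are placeholders, not values from this workspace.)

# Sketch only: standard ABFS OAuth settings pulled from a placeholder secret scope.
client_id = dbutils.secrets.get(scope="my-scope", key="sp-client-id")
client_secret = dbutils.secrets.get(scope="my-scope", key="sp-client-secret")
tenant_id = dbutils.secrets.get(scope="my-scope", key="tenant-id")

configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": client_id,
    "fs.azure.account.oauth2.client.secret": client_secret,
    "fs.azure.account.oauth2.client.endpoint":
        f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
}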
Azure Databricks
An Apache Spark-based analytics platform optimized for Azure.

2 answers

  1. PRADEEPCHEEKATLA 91,496 Reputation points Moderator
    2025-12-03T18:29:51.4566667+00:00

    Gokul Bondre - Thanks for the question and using MS Q&A platform.

    This is expected behavior on a Databricks workspace with Unity Catalog enabled by default.

    Note: On Unity Catalog-enabled clusters, dbutils.fs.mount is blocked for security and compliance reasons. Instead, you must use Unity Catalog storage credentials and external locations.

    Unity Catalog (UC) enforces strict access control policies, and traditional mounting techniques, such as using access keys or the dbutils.fs.mount command, are not recommended. See Best practices for DBFS and Unity Catalog.

    Databricks advises against using DBFS mounts for external data sources when working with Unity Catalog. Instead, it's recommended to use Unity Catalog's external locations/Volumes and storage credentials to manage data access, providing a more secure and governed approach.

    For more details, refer to Create a storage credential for connecting to Azure Data Lake Storage
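    As a rough sketch of that flow (the location name, group, and paths below are placeholders, not taken from the question): once the storage credential and external location exist, access is granted on the location and the data is read directly by its abfss:// path, with no mount involved.

    # Sketch only: grant file-level access on a (hypothetical) external location,
    # then read directly from cloud storage under Unity Catalog governance.
    spark.sql("GRANT READ FILES ON EXTERNAL LOCATION my_location TO `data_engineers`")

    df = spark.read.parquet(
        "abfss://my-container@mystorageaccount.dfs.core.windows.net/path/to/data"
    )
    display(df)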

    Hope this helps. Let me know if you have any further questions or need additional assistance. Also, if this answers your query, please click "Upvote" and "Accept the answer", as it might be beneficial to other community members reading this thread.


    ๐˜›๐˜ฐ ๐˜ด๐˜ต๐˜ข๐˜บ ๐˜ช๐˜ฏ๐˜ง๐˜ฐ๐˜ณ๐˜ฎ๐˜ฆ๐˜ฅ ๐˜ข๐˜ฃ๐˜ฐ๐˜ถ๐˜ต ๐˜ต๐˜ฉ๐˜ฆ ๐˜ญ๐˜ข๐˜ต๐˜ฆ๐˜ด๐˜ต ๐˜ถ๐˜ฑ๐˜ฅ๐˜ข๐˜ต๐˜ฆ๐˜ด ๐˜ข๐˜ฏ๐˜ฅ ๐˜ช๐˜ฏ๐˜ด๐˜ช๐˜จ๐˜ฉ๐˜ต๐˜ด ๐˜ฐ๐˜ฏ ๐˜ˆ๐˜ป๐˜ถ๐˜ณ๐˜ฆ ๐˜‹๐˜ข๐˜ต๐˜ข๐˜ฃ๐˜ณ๐˜ช๐˜ค๐˜ฌ๐˜ด, ๐˜ฅ๐˜ข๐˜ต๐˜ข ๐˜ฆ๐˜ฏ๐˜จ๐˜ช๐˜ฏ๐˜ฆ๐˜ฆ๐˜ณ๐˜ช๐˜ฏ๐˜จ, ๐˜ข๐˜ฏ๐˜ฅ Data & AI ๐˜ช๐˜ฏ๐˜ฏ๐˜ฐ๐˜ท๐˜ข๐˜ต๐˜ช๐˜ฐ๐˜ฏ๐˜ด, ๐˜ง๐˜ฐ๐˜ญ๐˜ญ๐˜ฐ๐˜ธ ๐˜ฎ๐˜ฆ ๐˜ฐ๐˜ฏย ๐˜“๐˜ช๐˜ฏ๐˜ฌ๐˜ฆ๐˜ฅ๐˜๐˜ฏ.


  2. Manoj Kumar Boyini 1,250 Reputation points Microsoft External Staff Moderator
    2025-12-05T13:14:24.23+00:00

    Hi Gokul Bondre,

    The error is expected because your workspace and cluster have Unity Catalog enabled. Unity Catalog blocks dbutils.fs.mount() for security and governance reasons, so mounts cannot be created on any UC-enabled cluster.

    Instead of using mounts, Databricks requires you to use one of the following:

    1. Storage credentials + external locations (recommended): create a storage credential using your service principal, create an external location pointing to your ADLS container, and then access the data directly.

    Example

    CREATE STORAGE CREDENTIAL my_cred
    USING (TYPE="SERVICE_PRINCIPAL",
           DIRECTORY_ID="<tenant_id>",
           APPLICATION_ID="<client_id>",
           CLIENT_SECRET="<client_secret>");
    
    CREATE EXTERNAL LOCATION my_location
    URL 'abfss://******@gokulsaa.dfs.core.windows.net/'
    WITH (STORAGE CREDENTIAL my_cred);
    

    Then read/write without mounting:

    df = spark.read.parquet("abfss://******@gokulsaa.dfs.core.windows.net/path")

    2. Unity Catalog volumes: UC-managed, folder-like storage:

    CREATE VOLUME my_catalog.default.my_volume;

    df.write.parquet("/Volumes/my_catalog/default/my_volume/data")
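    As a quick sketch (same placeholder names as above), data written to the volume can be read back or listed by its /Volumes path:

    # Sketch only: volume paths work with both Spark and dbutils.fs.
    df2 = spark.read.parquet("/Volumes/my_catalog/default/my_volume/data")
    dbutils.fs.ls("/Volumes/my_catalog/default/my_volume/")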

    If you want to use dbutils.fs.mount(), it will only work on a non-UC cluster, not on your current setup.
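    If you do go the non-UC route, a small defensive sketch (mount point and storage names are placeholders) is to check dbutils.fs.mounts() first so you do not try to re-mount an existing mount point:

    # Sketch only, for a non-Unity-Catalog cluster: mount only if the mount point
    # does not already exist. 'configs' is the OAuth dictionary from the question.
    mount_point = "/mnt/my_mount"
    if not any(m.mountPoint == mount_point for m in dbutils.fs.mounts()):
        dbutils.fs.mount(
            source="abfss://my-container@mystorageaccount.dfs.core.windows.net/",
            mount_point=mount_point,
            extra_configs=configs,
        )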

    Hope this helps. Please let us know if you have any questions or concerns.

