I am unable to mount containers using Databricks and ADLS Gen2.

smriti das 0 Reputation points
2024-06-05T15:57:33.72+00:00

(Screenshots of the mount command and the resulting error are not reproduced here.)

What is the issue?

Azure Data Lake Storage
An Azure service that provides an enterprise-wide hyper-scale repository for big data analytic workloads and is integrated with Azure Blob Storage.

Azure Databricks
An Apache Spark-based analytics platform optimized for Azure.

2 answers

  1. Nehruji R 8,146 Reputation points Microsoft Vendor
    2024-06-06T12:15:40.04+00:00

    Hello smriti das,

    Greetings! Welcome to Microsoft Q&A Platform.

    As per the error message (java.io.FileNotFoundException: Operation failed: "The specified filesystem does not exist.", 404, HEAD, https://datalakeprogen2.dfs.core.windows.net....getAccessControl&timeout=90), the filesystem referenced by your mount_Point does not exist.

    Ensure that your file system "Owfile" exists in the correct storage account; the error message shows the storage account you are trying to mount (datalakeprogen2, per the URL above).

    Note: To resolve this issue, pass the full access path (abfss://<container>@<account>.dfs.core.windows.net/) as the mount source instead of the mount point name; then you can mount the filesystem. (The original answer illustrated this with a screenshot, not reproduced here.)

    Refer to https://techcommunity.microsoft.com/t5/azure-paas-blog/mount-adls-gen2-or-blob-storage-in-azure-databricks/ba-p/3802926 for detailed guidance.

    Similar threads for reference: https://learn.microsoft.com/en-us/answers/questions/1510721/not-able-to-create-a-mount-point-using-spn-to-adls and https://learn.microsoft.com/en-us/answers/questions/1376442/mounting-azure-storage-adlsgen2-on-azure-databrick.
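
    The advice above (use the full abfss:// access path as the mount source) can be sketched as follows. This is a minimal illustration using OAuth with a service principal, which is one common way to mount ADLS Gen2 in Databricks; every name in angle brackets is a placeholder assumption, not a value from this thread, and the dbutils.fs.mount call only works inside a Databricks notebook:

    ```python
    # Sketch of mounting an ADLS Gen2 filesystem with an OAuth service principal.
    # All <angle-bracket> values are placeholders (assumptions, not from the thread).
    storage_account = "<storage-account-name>"
    container = "<container-name>"

    configs = {
        "fs.azure.account.auth.type": "OAuth",
        "fs.azure.account.oauth.provider.type":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        "fs.azure.account.oauth2.client.id": "<application-id>",
        "fs.azure.account.oauth2.client.secret": "<client-secret>",
        "fs.azure.account.oauth2.client.endpoint":
            "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
    }

    # The mount source must be the full abfss:// path to an existing container,
    # not the mount point name itself (the cause of the 404 above).
    source = f"abfss://{container}@{storage_account}.dfs.core.windows.net/"

    # Inside a Databricks notebook:
    # dbutils.fs.mount(source=source, mount_point="/mnt/<mount-name>", extra_configs=configs)
    print(source)
    ```

    If the container named in `source` does not exist in that storage account, the mount fails with exactly the 404 "The specified filesystem does not exist." error quoted above.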

    Hope this answer helps! Please let us know if you have any further queries. I’m happy to assist you further.


    Please "Accept the answer" and "up-vote" wherever the information provided helps you; this can be beneficial to other community members.


  2. Luis Arias 7,121 Reputation points
    2024-06-06T12:19:54.16+00:00

    Hi smriti das,

    Your setup looks fine to me; the issue could be on the network side. If it's just a test environment, you can enable public network access on your storage account.

    (Screenshot of the storage account networking settings, not reproduced here.)

    After that, you can test listing the storage with the commands below and your account access key (note that fs.azure.account.key expects the storage account access key, not a SAS token):

    storage_account_name = "<storage-account-name>"  # placeholder
    container_name = "<container-name>"              # placeholder

    spark.conf.set(
        f"fs.azure.account.key.{storage_account_name}.dfs.core.windows.net",
        "<storage-account-access-key>")

    dbutils.fs.ls(f"abfss://{container_name}@{storage_account_name}.dfs.core.windows.net/")

    Be sure that your storage account is ADLS Gen2 (hierarchical namespace enabled) to use abfss.
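
    For completeness: if what you actually have is a SAS token (as mentioned above), ABFS has dedicated SAS settings rather than the account-key one. A minimal sketch, assuming a Databricks cluster; all angle-bracket names are placeholders:

    ```python
    # Sketch: authenticating ABFS with a fixed SAS token (all values are placeholders).
    storage_account = "<storage-account-name>"
    container = "<container-name>"
    suffix = f"{storage_account}.dfs.core.windows.net"

    sas_configs = {
        f"fs.azure.account.auth.type.{suffix}": "SAS",
        f"fs.azure.sas.token.provider.type.{suffix}":
            "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider",
        f"fs.azure.sas.fixed.token.{suffix}": "<sas-token>",
    }

    # Inside a Databricks notebook:
    # for key, value in sas_configs.items():
    #     spark.conf.set(key, value)
    # dbutils.fs.ls(f"abfss://{container}@{storage_account}.dfs.core.windows.net/")
    ```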


    Cheers,

    Luis

