While connecting Azure Blob Storage to Azure Databricks, getting error: Operation failed

Bishnu Baliyase 130 Reputation points
2024-03-04T11:23:51.8333333+00:00

I am trying to mount a Blob container to Azure Databricks using a Service Principal, but I end up with the error below:

ExecutionError: An error occurred while calling o321.mount. : Operation failed: "This request is not authorized to perform this operation.", 403, HEAD, https://abcdatalake.dfs.core.windows.net/abcd/?upn=false&action=getAccessControl&timeout=90 at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.services.AbfsRestOperation.execute(AbfsRestOperation.java:200).

I am using:

  1. Storage V2
  2. "Storage Blob Data Contributor" access to Service Principal
  3. Valid Secret Key
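
For context, the mount call being attempted is roughly the following sketch (the client ID, tenant ID, mount point, and secret scope/key names are placeholders; the container and account names are taken from the error URL):

configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-client-id>",
    "fs.azure.account.oauth2.client.secret":
        dbutils.secrets.get(scope="<secret-scope>", key="<secret-key-name>"),
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

# Mount the container over ABFS with the service principal credentials above.
dbutils.fs.mount(
    source="abfss://abcd@abcdatalake.dfs.core.windows.net/",
    mount_point="/mnt/abcd",
    extra_configs=configs,
)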

Thank you in advance


Accepted answer
  1. Amrinder Singh 5,155 Reputation points Microsoft Employee
    2024-03-06T06:30:45.78+00:00

    Thanks @Bishnu Baliyase - thanks for testing it out, and glad it helped with the isolation. This suggests that the IP the request comes from when it hits the storage account is probably changing and is not one of the whitelisted IPs, hence the failure.

    Yes, you can create a VNET and then control access via a private endpoint (PE). The only thing we need to ensure is that the machine connecting is part of the associated VNET.
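
    As a quick check, you can print the public IP your Databricks cluster egresses from and compare it against the storage firewall's allow list. This is a minimal sketch; it assumes the cluster has outbound internet access and uses the third-party echo service api.ipify.org:

    import urllib.request

    # Print the public IP this cluster uses for outbound traffic, to compare
    # against the storage account firewall's allowed IP ranges.
    print(urllib.request.urlopen("https://api.ipify.org").read().decode())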

    https://learn.microsoft.com/en-us/azure/storage/common/storage-network-security?toc=%2Fazure%2Fstorage%2Fblobs%2Ftoc.json&bc=%2Fazure%2Fstorage%2Fblobs%2Fbreadcrumb%2Ftoc.json&tabs=azure-portal

    Please do not forget to "Accept the answer" and "up-vote" wherever the information provided helps you; this can be beneficial to other community members.

    1 person found this answer helpful.

1 additional answer

  1. Nehruji R 7,556 Reputation points Microsoft Vendor
    2024-03-05T09:13:16.94+00:00

    Hello Bishnu Baliyase,

    Greetings! Welcome to Microsoft Q&A Forum.

    I understand that when you try to mount a Blob container to Azure Databricks using a Service Principal, you get the error "Operation failed". The error message you're encountering during the mount operation indicates an authorization issue; below are some possible ways to overcome it.

    1. Sometimes, the filesystem initialization during mount can cause issues. Try adding the following entry to your configuration dictionary:

    "fs.azure.createRemoteFileSystemDuringInitialization": "false"

    This setting prevents Databricks from initializing the filesystem during mount, which might resolve the issue.
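
    For example, the flag would sit alongside your existing service-principal OAuth settings in the extra_configs dictionary passed to dbutils.fs.mount (a sketch; the other keys are whatever your current configuration already uses):

    configs = {
        # ... existing service principal OAuth settings ...
        # Do not try to create the remote filesystem (container) while mounting.
        "fs.azure.createRemoteFileSystemDuringInitialization": "false",
    }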

    2. After making the changes to the mount configuration, unmount the existing mount point with:

    dbutils.fs.unmount("/mnt/<mount-name>")

    and replace <mount-name> with the actual name of your mount.

    After unmounting, run dbutils.fs.refreshMounts() to refresh the mounts on other running clusters and propagate the update to all of them.
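
    Put together, the remount steps might look like the sketch below ("/mnt/<mount-name>", the container/account names, and the configs dictionary are placeholders for your own values):

    # Remove the stale mount point, then refresh mount metadata on running clusters.
    dbutils.fs.unmount("/mnt/<mount-name>")
    dbutils.fs.refreshMounts()

    # Recreate the mount with the updated extra_configs from step 1.
    dbutils.fs.mount(
        source="abfss://<container>@<account>.dfs.core.windows.net/",
        mount_point="/mnt/<mount-name>",
        extra_configs=configs,
    )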

    3. Also ensure that the "Storage Blob Data Contributor" role is assigned to the service principal.

    Refer to https://learn.microsoft.com/en-us/azure/databricks/dbfs/mounts (Azure Databricks documentation).

    For more details, refer to “Tutorial: Azure Data Lake Storage Gen2, Azure Databricks & Spark”.

    Similar Q&A threads for reference: https://learn.microsoft.com/en-us/answers/questions/1443327/shaded-databricks-org-apache-hadoop-fs-azure-azure and https://learn.microsoft.com/en-us/answers/questions/25915/azure-databricks-throwing-403-error

    Hope this answer helps! Please let us know if you have any further queries. I’m happy to assist you further.

    1 person found this answer helpful.
