Hello Bishnu Baliyase,
Greetings! Welcome to the Microsoft Q&A Forum.
I understand that when you try to mount a Blob container to Azure Databricks using a service principal, the mount fails with an "operation failed" error. The error message you're encountering indicates an authorization issue, and below are some possible ways to overcome it:
1. Sometimes, the filesystem initialization during mount can cause issues. Try adding the following entry to your configuration dictionary:
"fs.azure.createRemoteFileSystemDuringInitialization": "false"
This setting prevents Databricks from initializing the filesystem during mount, which might resolve the issue.
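For context, here is a minimal sketch of what the configuration dictionary could look like with this entry added, assuming you authenticate with a service principal via OAuth and store its secret in a Databricks secret scope; the angle-bracket values (application ID, secret scope and key name, directory ID) are placeholders, not values from your environment:

# Hypothetical configuration dictionary for a service principal (OAuth) mount.
# Replace the <...> placeholders with your own values.
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-id>",
    "fs.azure.account.oauth2.client.secret": dbutils.secrets.get(scope="<scope-name>", key="<service-credential-key>"),
    "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/<directory-id>/oauth2/token",
    # The setting from step 1: skip remote filesystem initialization during mount.
    "fs.azure.createRemoteFileSystemDuringInitialization": "false"
}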
2. After making the changes to the mount configuration, unmount the existing mount point:
dbutils.fs.unmount("/mnt/<mount-name>")
Replace <mount-name> with the actual name of your mount.
After unmounting, re-mount the container with the updated configuration, and then run dbutils.fs.refreshMounts() on any other running clusters to propagate the updated mount to them.
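As a rough sketch of that sequence, reusing the hypothetical configs dictionary from step 1 and assuming the container is accessed through the ABFS driver; <mount-name>, <container-name>, and <storage-account-name> are placeholders:

# Unmount the existing mount point.
dbutils.fs.unmount("/mnt/<mount-name>")

# Re-mount the container using the updated configuration dictionary.
dbutils.fs.mount(
    source="abfss://<container-name>@<storage-account-name>.dfs.core.windows.net/",
    mount_point="/mnt/<mount-name>",
    extra_configs=configs
)

# Run this on other running clusters so they pick up the updated mount.
dbutils.fs.refreshMounts()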
3. Also, ensure that the Storage Blob Data Contributor role is assigned to the service principal on the storage account (or container) scope.
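Once the role assignment has propagated (this can take a few minutes), a quick listing of the mount point is an easy way to confirm the permissions are in place; <mount-name> is again a placeholder:

# List the mounted path to verify the service principal can now read the container.
display(dbutils.fs.ls("/mnt/<mount-name>"))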
Refer to: https://learn.microsoft.com/en-us/azure/databricks/dbfs/mounts
For more details, refer to “Tutorial: Azure Data Lake Storage Gen2, Azure Databricks & Spark”.
Similar Q&A threads for reference: https://learn.microsoft.com/en-us/answers/questions/1443327/shaded-databricks-org-apache-hadoop-fs-azure-azure, https://learn.microsoft.com/en-us/answers/questions/25915/azure-databricks-throwing-403-error
Hope this answer helps! Please let us know if you have any further queries. I’m happy to assist you further.