Azure Databricks throwing 403 error

Ramya Harinarthini_MSFT 5,306 Reputation points Microsoft Employee
2020-05-08T08:32:51.27+00:00

I am running this command in a notebook:

configs = {"fs.azure.account.auth.type": "OAuth", 
           "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider", 
           "fs.azure.account.oauth2.client.id": "XXXXX Application ID in Azure Active Directory", 
           "fs.azure.account.oauth2.client.secret": dbutils.secrets.get(scope = "secret",key = "Key"), 
           "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/Directory ID/oauth2/token"} 
dbutils.fs.mount( 
  source = "abfss://raw@mydatalake.dfs.core.windows.net/", 
  mount_point = "/mnt/raw", extra_configs = configs) 

dbutils.fs.ls("mnt/raw/") 

But it throws this error:

StatusCode=403 
StatusDescription=This request is not authorized to perform this operation using this permission. 
ErrorCode=AuthorizationPermissionMismatch 

I have granted the service principal the required permissions (I made it Owner), but it still throws this error. It looks like a bug in the product.

[Note: As we migrate from MSDN, this question has been posted by an Azure Cloud Engineer as a frequently asked question]



Accepted answer
  1. PRADEEPCHEEKATLA-MSFT 76,511 Reputation points Microsoft Employee
    2020-05-08T09:03:13.473+00:00

    Welcome to the Microsoft Q&A platform.

    Happy to answer your questions.

    Note: When performing the steps in the "Assign the application to a role" section of the tutorial, make sure to assign the Storage Blob Data Contributor role to the service principal (a scripted sketch of this assignment follows below).
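
    If you prefer to script the role assignment rather than use the portal, here is a minimal sketch using the azure-identity and azure-mgmt-authorization Python packages. This is an illustrative sketch, not the exact steps from the tutorial: the angle-bracket values are placeholders, the storage account name is taken from the question, exact model shapes vary slightly between SDK versions, and ba92f5b4-2d11-453d-a403-e96b0029c9fe is the built-in role definition ID for Storage Blob Data Contributor.

    import uuid

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.authorization import AuthorizationManagementClient
    from azure.mgmt.authorization.models import RoleAssignmentCreateParameters

    subscription_id = "<subscription-id>"  # placeholder

    # Assign the role at the storage account scope.
    scope = (
        f"/subscriptions/{subscription_id}"
        "/resourceGroups/<resource-group>"
        "/providers/Microsoft.Storage/storageAccounts/mydatalake"
    )

    # Built-in role definition ID for "Storage Blob Data Contributor".
    role_definition_id = (
        f"/subscriptions/{subscription_id}"
        "/providers/Microsoft.Authorization/roleDefinitions"
        "/ba92f5b4-2d11-453d-a403-e96b0029c9fe"
    )

    client = AuthorizationManagementClient(DefaultAzureCredential(), subscription_id)
    client.role_assignments.create(
        scope,
        str(uuid.uuid4()),  # the role assignment name must be a new GUID
        RoleAssignmentCreateParameters(
            role_definition_id=role_definition_id,
            principal_id="<service-principal-object-id>",  # placeholder: SP object ID
        ),
    )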

    Repro: I granted the Owner role to the service principal and ran dbutils.fs.ls("/mnt/azure/"); it returned the same error message as above.

    [Screenshot: 403 AuthorizationPermissionMismatch error with only the Owner role assigned]

    Solution: I then assigned the Storage Blob Data Contributor role to the service principal.

    [Screenshot: assigning the Storage Blob Data Contributor role in the Azure portal]

    Finally, I was able to get the output without any error message after assigning the Storage Blob Data Contributor role to the service principal.

    [Screenshot: dbutils.fs.ls output listing the mounted files without errors]
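
    One caveat: role assignments can take a few minutes to propagate, and if the mount was created while the permissions were still missing, it is cleanest to unmount and remount before retesting. A minimal sketch, assuming the /mnt/raw mount point and the configs dict from the question:

    # Recreate the mount so the retest starts from a clean state.
    if any(m.mountPoint == "/mnt/raw" for m in dbutils.fs.mounts()):
        dbutils.fs.unmount("/mnt/raw")

    dbutils.fs.mount(
        source="abfss://raw@mydatalake.dfs.core.windows.net/",
        mount_point="/mnt/raw",
        extra_configs=configs,
    )
    dbutils.fs.ls("/mnt/raw/")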

    For more details, refer to "Tutorial: Azure Data Lake Storage Gen2, Azure Databricks & Spark".

    Hope this helps.



1 additional answer

  1. Ishant Kaushik 6 Reputation points
    2021-07-23T22:48:36.723+00:00

    I also faced the same issue, but later figured out that you need only the Storage Blob Data Contributor role assigned to your service principal on your data lake.
    If you have granted just the Contributor role, it will not work.
    In my case, having both Contributor and Storage Blob Data Contributor did not work either.
    You have to grant just Storage Blob Data Contributor on your Data Lake Gen2 account (a sketch for verifying the assignments programmatically follows below).
    [Screenshot: the service principal's role assignments on the Data Lake Gen2 account]
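
    To double-check which roles the service principal actually holds on the storage account, a sketch along these lines can help (again using azure-mgmt-authorization; angle-bracket values are placeholders, and attribute layout differs slightly across SDK versions):

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.authorization import AuthorizationManagementClient

    subscription_id = "<subscription-id>"
    scope = (
        f"/subscriptions/{subscription_id}"
        "/resourceGroups/<resource-group>"
        "/providers/Microsoft.Storage/storageAccounts/mydatalake"
    )

    client = AuthorizationManagementClient(DefaultAzureCredential(), subscription_id)
    assignments = client.role_assignments.list_for_scope(
        scope, filter="principalId eq '<service-principal-object-id>'"
    )
    for assignment in assignments:
        # The trailing GUID identifies the role definition, e.g.
        # ba92f5b4-2d11-453d-a403-e96b0029c9fe for Storage Blob Data Contributor.
        print(assignment.role_definition_id)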

    1 person found this answer helpful.