Azure Data Factory - cannot access share folder

June Zhu 21 Reputation points
2021-06-30T06:19:50.2+00:00

Hi, I want to set up a monitoring job on ADF using Scope, but the job fails with an error saying it has no access to the shared folder shares/searchDM/distrib. My job depends heavily on this folder because I need to use the SLAPI view and data. Could someone take a look and let me know how to get access?

https://aad.cosmos09.osdinfra.net/cosmos/shopping.prod/shares/searchDM/distrib/released/SLAPI/SearchLogPageView.view?property=info

[Screenshot attached: 110504-image.png]

Tags: Azure Data Lake Storage, Azure Data Factory, Azure Data Lake Analytics

Accepted answer
  KranthiPakala-MSFT, Microsoft Employee
    2021-06-30T22:12:45.033+00:00

    Hi @June Zhu ,

    Welcome to Microsoft Q&A forum and thanks for reaching out.

    It seems to be an ACL permission issue at the child folder level (distrib).

    If you are using Service Principal authentication, then please grant the service principal the proper permissions. See examples of how permissions work in Data Lake Storage Gen2 in Access control lists on files and directories.

    • As source: In Storage Explorer, grant at least Execute permission for ALL upstream folders and the file system, along with Read permission for the files to copy. Alternatively, in Access control (IAM), grant at least the Storage Blob Data Reader role.
    • As sink: In Storage Explorer, grant at least Execute permission for ALL upstream folders and the file system, along with Write permission for the sink folder. Alternatively, in Access control (IAM), grant at least the Storage Blob Data Contributor role.

    Note: If you use the Data Factory UI to author and the service principal is not assigned the "Storage Blob Data Reader/Contributor" role in IAM, then when testing the connection or browsing folders, choose "Test connection to file path" or "Browse from specified path", and specify a path with Read + Execute permission to continue.
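    The ACL grants above can be sketched with the Azure CLI. Everything below is illustrative, not taken from this thread: the storage account name `mystorageacct`, the file system `shares`, the folder path, and `<sp-object-id>` (the service principal's object ID) are all placeholders you would replace with your own values. Note also that `az storage fs access set --acl` replaces the existing ACL on that path, so in practice you would include any existing entries in the ACL string.

    ```shell
    # Hypothetical sketch: grant the service principal Execute (--x) on the file
    # system root and each upstream folder, then Read + Execute (r-x) on the
    # target folder, as described above.
    az storage fs access set --acl "user:<sp-object-id>:--x" \
        --path "/" --file-system shares \
        --account-name mystorageacct --auth-mode login

    az storage fs access set --acl "user:<sp-object-id>:--x" \
        --path "searchDM" --file-system shares \
        --account-name mystorageacct --auth-mode login

    az storage fs access set --acl "user:<sp-object-id>:r-x" \
        --path "searchDM/distrib" --file-system shares \
        --account-name mystorageacct --auth-mode login
    ```

    These commands require an authenticated Azure CLI session with permission to change ACLs on the account, so they are shown here only as a template.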

    If you are using Managed Identity authentication, then please grant the managed identity the proper permissions. See examples of how permissions work in Data Lake Storage Gen2 in Access control lists on files and directories.

    • As source: In Storage Explorer, grant at least Execute permission for ALL upstream folders and the file system, along with Read permission for the files to copy. Alternatively, in Access control (IAM), grant at least the Storage Blob Data Reader role.
    • As sink: In Storage Explorer, grant at least Execute permission for ALL upstream folders and the file system, along with Write permission for the sink folder. Alternatively, in Access control (IAM), grant at least the Storage Blob Data Contributor role.

    Note: If you use the Data Factory UI to author and the managed identity is not assigned the "Storage Blob Data Reader/Contributor" role in IAM, then when testing the connection or browsing folders, choose "Test connection to file path" or "Browse from specified path", and specify a path with Read + Execute permission to continue.
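    The coarser RBAC route mentioned above can be sketched with the Azure CLI as well. The subscription ID, resource group, storage account name, and `<managed-identity-object-id>` (the data factory's managed identity object ID, visible on the factory's Properties blade) are placeholders, not values from this thread:

    ```shell
    # Hypothetical sketch: assign "Storage Blob Data Reader" (for a source) to
    # the factory's managed identity at storage-account scope.
    az role assignment create \
        --assignee "<managed-identity-object-id>" \
        --role "Storage Blob Data Reader" \
        --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/mystorageacct"

    # For a sink, assign "Storage Blob Data Contributor" instead.
    ```

    This is broader than the ACL approach (it covers the whole account at that scope), which is why the answer describes ACLs first.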

    Hope this info helps. Do let us know how it goes.

    ----------

    Please don’t forget to Accept Answer and Up-Vote wherever the information provided helps you, as this can be beneficial to other community members.


0 additional answers