
Pipeline run failed although the connections tested during creation worked

JAD_B 1 Reputation point
Aug 10, 2020, 8:08 AM

I built a pipeline with Data Factory that reads data from a MongoDB and stores it in a Data Lake Storage Gen2 account. When testing the connections to source and sink during creation, both worked without any problems. But when checking the "File Path" connection for the sink, I am receiving an authorization error. The connection is set up with a Managed Identity. Moreover, I added the Data Factory resource to the storage account.

When executing the pipeline, the run fails and I receive the following error:

Operation on target Copy_36v failed: Failure happened on 'Sink' side. ErrorCode=AdlsGen2OperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=ADLS Gen2 operation failed for: Operation returned an invalid status code 'Forbidden'. Account: 'xxxxx'. FileSystem: 'mongodbleads'. Path: 'output/data_62........txt'. ErrorCode: 'AuthorizationPermissionMismatch'. Message: 'This request is not authorized to perform this operation using this permission.'.
...
'Forbidden',Source=,''Type=Microsoft.Azure.Storage.Data.Models.ErrorSchemaException,Message=Operation returned an invalid status code 'Forbidden',Source=Microsoft.DataTransfer.ClientLibrary,'

Any help would be appreciated. It seems that the Data Factory still cannot fully access the storage account.

Do I have to manually add a blob container in my target Data Lake Gen2 account?

Thanks!

Tags: Azure Data Lake Storage, Azure Data Factory

3 answers

  1. Vaibhav Chaudhari 38,901 Reputation points
    Aug 10, 2020, 12:43 PM

    Ensure the Storage Blob Data Contributor role is granted to the ADF managed identity. See the second point in the document below:

    connector-azure-data-lake-storage
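
    As a quick sanity check, a minimal Python sketch like the one below, run under an identity granted the same role, can confirm whether the assignment has actually taken effect (RBAC changes can take a few minutes to propagate). The account URL is a placeholder; the filesystem and path are taken from the error message in the question.

    ```python
    # Minimal write test against the ADLS Gen2 sink.
    # pip install azure-identity azure-storage-file-datalake
    from azure.identity import DefaultAzureCredential
    from azure.storage.filedatalake import DataLakeServiceClient

    ACCOUNT_URL = "https://<your-account>.dfs.core.windows.net"  # placeholder
    FILESYSTEM = "mongodbleads"  # filesystem name from the error message

    credential = DefaultAzureCredential()
    service = DataLakeServiceClient(account_url=ACCOUNT_URL, credential=credential)

    file_client = service.get_file_system_client(FILESYSTEM).get_file_client(
        "output/permission-check.txt"
    )

    # An AuthorizationPermissionMismatch here reproduces the pipeline's
    # 'Forbidden' error; success means the role assignment is effective.
    file_client.upload_data(b"permission check", overwrite=True)
    print("Write succeeded - the role assignment is effective.")
    ```

    If this fails with the same AuthorizationPermissionMismatch, the problem is the role assignment or its scope rather than the pipeline definition.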

    ===============================================

    If the response helped, do "Accept Answer" and upvote it -- Vaibhav

    1 person found this answer helpful.

  2. Gurvinder Kandhola 46 Reputation points
    Jun 18, 2021, 7:45 PM

    Can anyone please help with this? We are stuck, and this is in production.

    We are getting this error message in ADF while writing to blob (ADLS Gen2) storage:
    "Failure happened on 'Sink' side. ErrorCode=AdlsGen2OperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=ADLS Gen2 operation failed for: Operation returned an invalid status code 'Forbidden'. Account "
    This happens with the "Selected networks" option under Networking in the storage account. We have granted the Storage Blob Data Contributor role to ADF on the storage account. Everything works fine with "All networks", but not with the private endpoint. We created a private endpoint and approved the connection request from ADF that is generated when creating the endpoint in Data Factory. (A diagnostic sketch follows below.)

    (screenshot: 107192-image.png)
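
    Not a fix, but a diagnostic that may narrow this down: a sketch using the azure-mgmt-storage SDK to print the firewall default action and the status of each private endpoint connection on the account (all resource names below are placeholders):

    ```python
    # Inspect the storage account's network rules and private endpoint state.
    # pip install azure-identity azure-mgmt-storage
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.storage import StorageManagementClient

    SUBSCRIPTION_ID = "<subscription-id>"  # placeholder
    RESOURCE_GROUP = "<resource-group>"    # placeholder
    ACCOUNT_NAME = "<storage-account>"     # placeholder

    client = StorageManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)
    account = client.storage_accounts.get_properties(RESOURCE_GROUP, ACCOUNT_NAME)

    rules = account.network_rule_set
    print("default_action:", rules.default_action)  # 'Deny' under "Selected networks"
    print("bypass:", rules.bypass)

    # Pending or rejected private endpoint connections surface here:
    for pec in account.private_endpoint_connections or []:
        print(pec.name, pec.private_link_service_connection_state.status)
    ```

    If default_action is 'Deny' and the ADF-created connection is not 'Approved', traffic from the integration runtime would still be rejected as 'Forbidden'.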


  3. woodymoo 321 Reputation points
    Nov 30, 2022, 11:04 PM

    @Divya Sahu

    I got the same error 2200. I tried to set up a pipeline to transfer data from SQL Server in a VM to Snowflake, and it failed on the first step: storing the CSV file into the storage account via IntegrationRuntimeEngine...
    I fixed the issue in two steps:

    1. In the storage account's access management (IAM) page, assign the Storage Blob Data Contributor role to the managed identity of the Synapse service.
       (screenshot: 265903-pasted-graphic-12.png)
    2. I found I had forgotten to add the "Create" and "Write" permissions on the storage account SAS token, so I regenerated one and updated the linked service to the storage account in Synapse (a sketch of regenerating such a token follows below).
       (screenshot: 265839-pasted-graphic-11.png)

    Finally, it succeeded:
    (screenshot: 265838-image.png)
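
    For step 2, a minimal sketch of what regenerating an account SAS with the previously missing "Create" and "Write" permissions can look like, using the azure-storage-blob package (the account name and key are placeholders; keep the key out of source control):

    ```python
    # Regenerate an account SAS that includes Create and Write permissions.
    # pip install azure-storage-blob
    from datetime import datetime, timedelta, timezone

    from azure.storage.blob import (
        AccountSasPermissions,
        ResourceTypes,
        generate_account_sas,
    )

    ACCOUNT_NAME = "<storage-account>"  # placeholder
    ACCOUNT_KEY = "<account-key>"       # placeholder

    sas_token = generate_account_sas(
        account_name=ACCOUNT_NAME,
        account_key=ACCOUNT_KEY,
        resource_types=ResourceTypes(service=True, container=True, object=True),
        # 'create' and 'write' were the permissions missing from the old token
        permission=AccountSasPermissions(read=True, list=True, create=True, write=True),
        expiry=datetime.now(timezone.utc) + timedelta(days=7),
    )
    print(sas_token)  # paste into the Synapse linked service for the storage account
    ```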

