ADF Dataflow fails when triggered but succeeds when debugging

Veera Mummidi 25 Reputation points
2024-01-17T22:12:20.5766667+00:00

Hi, when I run the pipeline in debug mode the data flow succeeds in ADF, but when I trigger the pipeline it throws the error below.

Job failed due to reason: at Source 'source1': This request is not authorized to perform this operation. When using Managed Identity(MI)/Service Principal(SP) authentication
	1. For source: In Storage Explorer, grant the MI/SP at least Execute permission for ALL upstream folders and the file system, along with Read permission for the files to copy. Alternatively, in Access control (IAM), grant the MI/SP at least the Storage Blob Data Reader role.
	2. For sink: In Storage Explorer, grant the MI/SP at least Execute permission for ALL upstream folders and the file system, along with Write permission for the sink folder. Alternatively, in Access control (IAM), grant the MI/SP at least the Storage Blob Data Contributor role.
Also please ensure that the network firewall settings in the storage account are configured correctly, as turning on firewall rules for your storage account blocks incoming requests for data by default, unless the requests originate from a service operating within an Azure Virtual Network (VNet).

Accepted answer
  PRADEEPCHEEKATLA 90,651 Reputation points Moderator
    2024-01-18T02:06:22.6866667+00:00

    Veera Mummidi - Thanks for the question and for using the MS Q&A platform.

    It looks like the error message is related to the permissions of the Managed Identity (MI) or Service Principal (SP) used for authentication in Azure Data Factory. The error message suggests that the MI/SP does not have the necessary permissions to access the source and sink data stores. When you run the pipeline in debug mode, it uses your own credentials to access the data stores, which is why it succeeds. However, when you trigger the pipeline, it uses the MI/SP credentials, which may not have the necessary permissions.

    To resolve this issue, you can try the following steps:

    • In Storage Explorer, grant the MI/SP at least Execute permission for ALL upstream folders and the file system, along with Read permission for the files to copy. Alternatively, in Access control (IAM), grant the MI/SP at least the Storage Blob Data Reader role.
    • In Storage Explorer, grant the MI/SP at least Execute permission for ALL upstream folders and the file system, along with Write permission for the sink folder. Alternatively, in Access control (IAM), grant the MI/SP at least the Storage Blob Data Contributor role.
    • Ensure that the network firewall settings in the storage account are configured correctly. Turning on firewall rules for your storage account blocks incoming requests for data by default, unless the requests originate from a service operating within an Azure Virtual Network (VNet).
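    The steps above can be scripted with the Azure CLI. This is a minimal sketch, not the only way to grant access: the resource group (`myrg`), storage account (`mystorageacct`), and the principal ID placeholder are assumptions you must replace with your own values (the principal ID is the object ID of your data factory's managed identity or service principal).

    ```shell
    # Hypothetical names -- replace with your resource group, storage account,
    # and the object (principal) ID of the ADF managed identity / service principal.
    RG=myrg
    STORAGE=mystorageacct
    PRINCIPAL_ID=00000000-0000-0000-0000-000000000000

    # Resource ID of the storage account, used as the role-assignment scope.
    SCOPE=$(az storage account show --name "$STORAGE" --resource-group "$RG" --query id -o tsv)

    # Source side: Storage Blob Data Reader.
    az role assignment create \
      --assignee-object-id "$PRINCIPAL_ID" \
      --assignee-principal-type ServicePrincipal \
      --role "Storage Blob Data Reader" \
      --scope "$SCOPE"

    # Sink side: Storage Blob Data Contributor.
    az role assignment create \
      --assignee-object-id "$PRINCIPAL_ID" \
      --assignee-principal-type ServicePrincipal \
      --role "Storage Blob Data Contributor" \
      --scope "$SCOPE"

    # Firewall point: allow trusted Azure services through the storage
    # firewall instead of opening it to all networks.
    az storage account update --name "$STORAGE" --resource-group "$RG" \
      --bypass AzureServices --default-action Deny
    ```

    Role assignments can take a few minutes to propagate before the triggered pipeline picks them up.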


    To reproduce the issue, I created a User Assigned Managed Identity named chepraUAMI.

    Error: When I tried to access the storage account without permissions:

    [Image: 268141-image.png]

    Make sure you have Storage Blob Data Contributor permission on the User Assigned Managed Identity.
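    For a user-assigned managed identity such as chepraUAMI from the repro, the role can also be granted from the Azure CLI. A sketch, assuming a hypothetical resource group `myrg` and storage account `mystorageacct`:

    ```shell
    # Look up the principal (object) ID of the user-assigned managed identity.
    UAMI_ID=$(az identity show --name chepraUAMI --resource-group myrg \
      --query principalId -o tsv)

    # Grant Storage Blob Data Contributor on the storage account scope.
    az role assignment create \
      --assignee-object-id "$UAMI_ID" \
      --assignee-principal-type ServicePrincipal \
      --role "Storage Blob Data Contributor" \
      --scope "$(az storage account show --name mystorageacct \
                   --resource-group myrg --query id -o tsv)"
    ```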

    [Image: 268133-image.png]

    Success: After granting the permissions, I was able to connect successfully via the User Assigned Managed Identity.

    [Image: 268124-image.png]

    For more details, refer to ADF - User-assigned managed identity authentication and Support for user-assigned managed identity in Azure Data Factory.

    If the issue persists, you may need to recreate the MI/SP with the necessary permissions and update the credentials in Azure Data Factory.
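    Before recreating the MI/SP, it may be worth confirming what roles the identity already holds on the storage account. A sketch with the same hypothetical names as above:

    ```shell
    # Object ID of the identity to check -- replace with yours.
    PRINCIPAL_ID=00000000-0000-0000-0000-000000000000

    # List role assignments the identity holds at the storage account scope.
    az role assignment list \
      --assignee "$PRINCIPAL_ID" \
      --scope "$(az storage account show --name mystorageacct \
                   --resource-group myrg --query id -o tsv)" \
      -o table
    ```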

    Hope this helps. Do let us know if you have any further queries.


    If this answers your query, do click Accept Answer and Yes for was this answer helpful. And, if you have any further query do let us know.

