Data Factory pipeline fails when copying large data set from AWS S3 to Azure

Nicholas Tham 0 Reputation points
2026-02-10T18:42:08.5933333+00:00

I am trying to copy a large data set from AWS S3 to Azure using a Data Factory pipeline, but I am getting the following error. Could you please help me figure out the problem?

Error code: AzureFileOperationFailed

Failure type: User configuration issue

Details: Failure happened on 'Sink' side. ErrorCode=AzureFileOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Azure File operation Failed. Path: Shared/IT/blah/blah/blah/foo.html. ErrorMessage: Response status code does not indicate success: 500 (Operation could not be completed within the specified time.)..,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Net.Http.HttpRequestException,Message=Response status code does not indicate success: 500 (Operation could not be completed within the specified time.).,Source=System.Net.Http,'

Azure Data Factory

An Azure service for ingesting, preparing, and transforming data at scale.

1 answer

  1. Dillon Silzer 60,816 Reputation points Volunteer Moderator
    2026-02-11T22:30:21.3833333+00:00

    Hi Nicholas,

    It looks like you are writing to Azure File Storage. I would recommend considering Azure Blob Storage instead.

    https://azure.microsoft.com/en-us/products/storage/blobs

    It would also help if you let everyone know here how large the files coming from AWS are.
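
    As a rough sketch, switching the sink means pointing the Copy activity's sink dataset at Blob Storage rather than Azure Files. Assuming a binary (file-by-file) copy, the activity might look something like this (the dataset names `S3BinaryDataset` and `BlobBinaryDataset` are hypothetical placeholders for your own datasets):

    ```json
    {
      "name": "CopyS3ToBlob",
      "type": "Copy",
      "typeProperties": {
        "source": {
          "type": "BinarySource",
          "storeSettings": { "type": "AmazonS3ReadSettings", "recursive": true }
        },
        "sink": {
          "type": "BinarySink",
          "storeSettings": { "type": "AzureBlobStorageWriteSettings" }
        }
      },
      "inputs": [ { "referenceName": "S3BinaryDataset", "type": "DatasetReference" } ],
      "outputs": [ { "referenceName": "BlobBinaryDataset", "type": "DatasetReference" } ]
    }
    ```

    The idea is that Blob Storage tends to cope better with high-throughput bulk copies than the Azure Files endpoint, which is where your 500 "Operation could not be completed within the specified time" error is being raised.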

