Hi All,
Update from the SR on this issue: we have received an update from the product team, and per that update they have recently upgraded the Azure File Storage connector.
It is therefore suggested to create a new Azure File Storage linked service and trigger the copy with the new linked service to avoid the issue.
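For reference, a minimal sketch of what such a linked service definition might look like (the name, file share, and placeholder connection string here are illustrative assumptions, not values from the original thread):

```json
{
  "name": "AzureFileStorageLinkedService_New",
  "properties": {
    "type": "AzureFileStorage",
    "typeProperties": {
      "connectionString": "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>;EndpointSuffix=core.windows.net",
      "fileShare": "<your-file-share>"
    }
  }
}
```

You can create this through the ADF UI (Manage > Linked services > New) or deploy the JSON directly, then point the copy activity's sink dataset at the new linked service.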
However, a bug has been identified with the above suggestion: it works with small files, but when copying large files (e.g. 1 GB) from on-premises to Azure File Storage, the following error is triggered:
```json
{
  "errorCode": "2200",
  "message": "Failure happened on 'Sink' side. ErrorCode=AzureFileOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Azure File operation Failed. Path: xxxxx/xxxxx/xxxxx.txt. ErrorMessage: Error Message: The request body is too large and exceeds the maximum permissible limit. (ErrorCode: 413, Detail: The request body is too large and exceeds the maximum permissible limit., RequestId: 6e83fa0b-301a-0009-1c42-803b39000000).,Source=Microsoft.DataTransfer.ClientLibrary,''Type=Microsoft.Azure.Storage.StorageException,Message=The request body is too large and exceeds the maximum permissible limit.,Source=Microsoft.Azure.Storage.Common,'",
  "failureType": "UserError",
  "target": "Copy data1",
  "details": []
}
```
As a workaround until the fix is released, the product team suggested using a staged copy with Azure Blob Storage as the staging data store.
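A staged copy is enabled on the copy activity itself via the `enableStaging` and `stagingSettings` properties. A minimal sketch is below; the linked service name and container path are illustrative assumptions, so substitute your own Blob Storage linked service:

```json
{
  "name": "Copy data1",
  "type": "Copy",
  "typeProperties": {
    "source": { "type": "FileSystemSource" },
    "sink": { "type": "AzureFileStorageSink" },
    "enableStaging": true,
    "stagingSettings": {
      "linkedServiceName": {
        "referenceName": "AzureBlobStagingLinkedService",
        "type": "LinkedServiceReference"
      },
      "path": "staging-container"
    }
  }
}
```

With staging enabled, the service first copies the data to the Blob container and then writes it to the Azure File Storage sink, which avoids the oversized direct requests behind the 413 error.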
The ETA for the fix is end of September 2020 (note: this is a tentative date).
Hope the above info helps. Apologies for any inconvenience caused by this issue.
Thank you for your patience.
Please consider clicking "Accept Answer" and "Upvote" on the post that helps you, as it can be beneficial to other community members.