Azure Data Factory - Technical bug when ingesting data from an on-premises File System to Azure File Storage

Manas Agarwal 11 Reputation points
2020-07-27T13:02:31.163+00:00

Hi Team,

I was working on a pipeline to move data from an on-premises file system to Azure File Storage. I found a strange bug that causes an error when an Azure File Storage linked service is created through the UI, while the same linked service works smoothly when created from a JSON definition.

The Azure File Storage linked service is created successfully, and the test connection also works fine, but when the pipeline executes, it resolves a local drive path instead of the Azure File Share path and throws an error saying I don't have permission on that path.

I don't know why it shows a drive path. Why is it trying to write data to a drive path instead of the Azure File Storage path?

The same issue does not occur when I create the linked service using the JSON option: it picks up the Azure File Storage path properly and ingests the file successfully.
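For reference, this is roughly the kind of JSON definition that works; a minimal sketch, where the account name, account key, and file share name are placeholders (the exact set of properties may differ depending on the connector version):

    {
        "name": "AzureFileStorageLinkedService",
        "properties": {
            "type": "AzureFileStorage",
            "typeProperties": {
                "connectionString": "DefaultEndpointsProtocol=https;AccountName=<account-name>;AccountKey=<account-key>;EndpointSuffix=core.windows.net",
                "fileShare": "<file-share-name>"
            }
        }
    }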

It seems like a bug in the linked service creation UI for Azure File Storage. I have attached the error message for your reference.

Error details.

{ "errorCode": "2200", "message": "ErrorCode=UserErrorPermissionDeniedOnCloudIR,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Access 'D:\folder_test\sub_folder' is not allowed on Azure integrate runtime.,Source=Microsoft.DataTransfer.ClientLibrary,'", "failureType": "UserError", "target": "Sample_Copy", "details": [] }

Kindly resolve this bug in Azure Data Factory.

Thanks,

Manas

Azure Data Factory

1 answer

  1. KranthiPakala-MSFT 46,492 Reputation points Microsoft Employee
    2020-09-15T00:17:48.957+00:00

    Hi All,

    Update from the support request (SR) on this issue: we have received an update from the product team that they recently upgraded the Azure File Storage connector.

    It is therefore suggested to create a new Azure File Storage linked service and trigger a copy with the new linked service to avoid the issue.

    However, a bug has been identified with the above suggestion: it works for small files, but when copying large files (for example, 1 GB) from on-premises to Azure File Storage, the following error is triggered:

    { "errorCode": "2200", "message": "Failure happened on 'Sink' side. ErrorCode=AzureFileOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Azure File operation Failed. Path: xxxxx/xxxxx/xxxxx.txt. ErrorMessage: Error Message: The request body is too large and exceeds the maximum permissible limit. (ErrorCode: 413, Detail: The request body is too large and exceeds the maximum permissible limit., RequestId: 6e83fa0b-301a-0009-1c42-803b39000000).,Source=Microsoft.DataTransfer.ClientLibrary,''Type=Microsoft.Azure.Storage.StorageException,Message=The request body is too large and exceeds the maximum permissible limit.,Source=Microsoft.Azure.Storage.Common,'", "failureType": "UserError", "target": "Copy data1", "details": [] }
    

    As a workaround until the fix is deployed, the product team suggested using a staged copy with Azure Blob Storage as the staging data store.
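    For anyone applying the workaround: staged copy is configured on the copy activity itself via the enableStaging and stagingSettings properties. A minimal sketch follows, with the source and sink abbreviated and the staging linked service name and container path as placeholders:

        {
            "name": "Copy data1",
            "type": "Copy",
            "typeProperties": {
                "source": { "type": "BinarySource" },
                "sink": { "type": "BinarySink" },
                "enableStaging": true,
                "stagingSettings": {
                    "linkedServiceName": {
                        "referenceName": "<blob-staging-linked-service>",
                        "type": "LinkedServiceReference"
                    },
                    "path": "<staging-container>/<path>"
                }
            }
        }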

    The ETA for the fix is the end of September 2020 (note: this is a tentative date).

    Hope the above info helps. Apologies for any inconvenience caused by this issue.

    Thank you for your patience.


    Please consider clicking "Accept Answer" and "Upvote" on the post that helps you, as it can be beneficial to other community members.

