question

Vinay5-0499 asked KranthiPakala-MSFT commented

System.OutOfMemoryException while copying data from Amazon S3 to Azure Blob.

Hi team,

I am getting the below error while copying data from Amazon S3 to Azure Blob. The source is a zipped XML file, and I want to copy the file to the target without unzipping it.
The size of the zipped file is 77 MB. I am trying to write it to a JSON file in the target.
The integration runtime is self-hosted, and Max concurrent connections and DIU are set to Default. I was able to copy smaller files with the same settings.



Failure happened on 'Sink' side. ErrorCode=SystemErrorOutOfMemory,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The available memory of the Integrated Runtime (self-hosted) is too small, please increase your machine memory.,Source=Microsoft.DataTransfer.TransferTask,''Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=A task failed with out of memory.,Source=,''Type=System.OutOfMemoryException,Message=Exception of type 'System.OutOfMemoryException' was thrown.,Source=mscorlib,'

Could you please let me know a workaround to resolve this issue?


I have one more question. If my source in Amazon S3 is XML, can I save the file as XML in my target Azure Blob using the Copy activity?
I tried to use an XML sink dataset, but it only accepts JSON as the target sink when copying an XML source from S3.

Thank you.

azure-data-factory


1 Answer

KranthiPakala-MSFT answered KranthiPakala-MSFT commented

Hi @Vinay5-0499,

Thanks for reaching out. From the error message it is clear that available memory was low on the machine hosting the self-hosted integration runtime (SHIR) while this run was in progress.

When you encounter such issues, the first recommendation, as the error message itself suggests, is to increase the memory available on the SHIR machine.

Also please refer to this troubleshooting guide: Troubleshoot self-hosted integration runtime

Hope this info helps. Do let us know if you have any further queries.



Please don’t forget to Accept Answer and Up-Vote wherever the information provided helps you, this can be beneficial to other community members.




Hi Kranthi,

Thanks for your reply.

I changed the source dataset type from XML to Binary, and the pipeline completed without any issues.
I verified the source and target files after the copy, and they look good.
Can I use this workaround?

Thank you.


Hi Vinay5-0499,

Thanks for your response, and glad to know that the issue was resolved. If you are not doing any mapping and are just copying the file as-is, then you are good to use a Binary dataset.

And yes, you can use the workaround.
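For reference, a minimal sketch of what such a Copy activity looks like with Binary datasets on both sides (the dataset and activity names here are hypothetical; the `BinarySource`/`BinarySink` and store-settings type names follow the ADF Binary format and connector documentation):

```json
{
  "name": "CopyS3ZipToBlobAsIs",
  "type": "Copy",
  "typeProperties": {
    "source": {
      "type": "BinarySource",
      "storeSettings": { "type": "AmazonS3ReadSettings" }
    },
    "sink": {
      "type": "BinarySink",
      "storeSettings": { "type": "AzureBlobStorageWriteSettings" }
    }
  },
  "inputs":  [ { "referenceName": "S3BinaryDataset",   "type": "DatasetReference" } ],
  "outputs": [ { "referenceName": "BlobBinaryDataset", "type": "DatasetReference" } ]
}
```

Because Binary datasets carry no schema, no parsing or mapping happens, so the zipped XML is streamed through unchanged and the file arrives in Blob storage byte-for-byte identical.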

Do let us know if you have any further queries.



