Copying a huge amount of data by using ADF

Samy Abdul 3,366 Reputation points
2021-08-05T16:55:33.307+00:00

Hi All, I am sorry if this question has come up before. Basically, I want to copy a huge amount of data, on the order of billions of rows, into ADLS or Blob storage. One workaround is to segregate the data into smaller blocks and copy them sequentially instead of all at once, from a performance perspective. I would really appreciate it if you could let me know what other approaches are possible to achieve both performance and consistency. Thanks in advance.
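To make the chunking workaround described above concrete, here is a minimal sketch (the helper name and the slice sizes are hypothetical, not part of any ADF API) of splitting a large key range into smaller slices, each of which could drive one copy operation:

```python
def split_range(start, end, chunk_size):
    """Yield (lo, hi) half-open slices covering [start, end)."""
    lo = start
    while lo < end:
        hi = min(lo + chunk_size, end)
        yield (lo, hi)
        lo = hi

# Example: one billion rows split into 100 slices of 10 million rows each.
slices = list(split_range(0, 1_000_000_000, 10_000_000))
```

Each slice can then be copied (and, on failure, retried) independently instead of re-running one monolithic copy.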

Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

Accepted answer
  1. ShaikMaheer-MSFT 38,126 Reputation points Microsoft Employee
    2021-08-06T13:00:15.393+00:00

    Hi @Samy Abdul ,

    Thank you for posting your query in Microsoft Q&A Platform.

    Below are a few useful recommendations from Microsoft for handling bulk data copies. Kindly check them:

    • Partition the source data (for example, by date or key range) and drive the slices with a Lookup plus ForEach pattern, so that a failed slice can be retried without re-copying everything.
    • Increase the degree of copy parallelism (parallelCopies) on the Copy activity so that partitions are read and written concurrently.
    • Scale up the Data Integration Units (DIUs) allocated to the Copy activity.
    • Use staged copy through Blob storage when the source or the network is the bottleneck.

    Kindly go with any of the above suggested approaches based on your source and sink types and your ETL needs.

    Hope this will help. Please let us know if you have any further queries. Thank you.
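As a sketch of how these bulk-copy knobs are applied, the fragment below shows the relevant part of a Copy activity definition expressed as a Python dict. The property names (parallelCopies, dataIntegrationUnits, enableStaging) are documented Copy activity settings, but the specific values are illustrative assumptions, not recommendations:

```python
# Illustrative Copy activity tuning fragment; ADF exposes these
# properties under typeProperties of the Copy activity pipeline JSON.
copy_activity_settings = {
    "type": "Copy",
    "typeProperties": {
        # Read/write up to 16 partitions concurrently.
        "parallelCopies": 16,
        # Scale out the serverless compute used by the copy.
        "dataIntegrationUnits": 32,
        # Stage through Blob storage when the direct path is slow.
        "enableStaging": True,
    },
}
```

In practice these settings are tuned together: more parallel copies only help when the source can serve partitions concurrently and enough DIUs are allocated.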


0 additional answers
