How to do ETL of a very large table efficiently?
What is the most efficient way to do ETL of a very large table? The way we have been doing it takes a long time to complete.
3 answers
Lukas Yu -MSFT 5,821 Reputation points
2020-11-30T02:02:54.137+00:00
Hi,
Generally, to load large volumes of data you could follow: SQL OFFSET FETCH Feature: Loading Large Volumes of Data Using Limited Resources. Also, see the design tips in: 10-tips-to-improve-etl-performance
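The OFFSET/FETCH approach mentioned above extracts the table in fixed-size pages instead of one huge query. A minimal sketch of how the paged statements could be generated (table name, key column, and batch size here are illustrative assumptions; run each statement through your own database connection, e.g. pyodbc):

```python
def offset_fetch_batches(table, order_col, total_rows, batch_size):
    """Yield one SELECT per batch, paging through the table with OFFSET/FETCH.

    OFFSET/FETCH requires a stable ORDER BY column so every page sees
    the rows in the same order.
    """
    for offset in range(0, total_rows, batch_size):
        yield (
            f"SELECT * FROM {table} "
            f"ORDER BY {order_col} "
            f"OFFSET {offset} ROWS FETCH NEXT {batch_size} ROWS ONLY;"
        )

# Example: 1,000,000 rows pulled in 250,000-row batches -> 4 statements.
stmts = list(offset_fetch_batches("dbo.BigTable", "Id", 1_000_000, 250_000))
```

Each batch can then be loaded and committed separately, keeping memory and transaction-log usage bounded.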
If the answer is helpful, please click "Accept Answer" and upvote it.
Note: Please follow the steps in our documentation to enable e-mail notifications if you want to receive the related email notification for this thread.
Hot issues in November: What can I do if my transaction log is full?
Hot issues in November: How to convert Profiler trace into a SQL Server table?
Monalv-MSFT 5,901 Reputation points
2020-11-30T05:48:14.627+00:00
Hi @Surendra Adhikari,
We can refer to the following steps to improve the performance of an SSIS package:
1. Use a SQL statement in the source component
2. Get as many rows as you can into a buffer
3. Don’t use the default buffer settings
4. Avoid blocking transformations
5. Don’t use the OLE DB command transformation

Hope the following links will be helpful:
Integration Services Performance Best Practices – Data Flow Optimization
SQL Server Integration Services SSIS Performance Tuning Techniques
Best Regards,
Mona