SQL Server bulk export of a large table from one SQL Server to a different SQL Server with a WHERE clause?

techresearch7777777 1,766 Reputation points
2023-02-28T21:44:07.8466667+00:00

Hello, we have a large source table on SQL Server VM_1 that needs to be copied to a destination on a different SQL Server, VM_2, with a WHERE clause.

What's the recommended way of doing this? SQL Server Management Studio -> Tasks -> Export Data... -> "Write a query to specify the data to transfer"? Will this have an impact on the transaction log, and if so, on which server (VM_1 or VM_2)? Is this method stable given that it could run for hours, or is there anything else that might cause it to fail?

Or should we export to a flat file first, or do something different?

Thanks in advance.


4 answers

  1. Erland Sommarskog 100.9K Reputation points MVP
    2023-02-28T22:43:11.0166667+00:00

    Generally, reading data does not write to the transaction log. But a long-running query could have secondary effects, particularly if you have some sort of snapshot isolation enabled.

    If this export/import is the only activity on the server, you may try to do it directly. But if there are other users active, it may be better to export to a file first and then import from the file.
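
    For illustration, exporting just the filtered rows to a file could look like this from a command prompt (server, database, table, column, and path names below are placeholders):

        bcp "SELECT Col1, Col2 FROM SourceDb.dbo.BigTable WHERE LoadDate >= '20230101'" queryout C:\temp\BigTable.dat -S VM_1 -T -n

    The -n switch writes native format, which is usually the fastest and most robust choice when both sides are SQL Server, and -T uses Windows authentication.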

    Overall, the question is difficult to answer concisely, because there are so many "it depends" factors here.


  2. Guoxiong 8,126 Reputation points
    2023-02-28T22:47:29.9366667+00:00

    I would try to use the bcp utility to copy the rows to a bcp file and then use a format file to bulk import into the destination table.
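
    As a rough sketch of that approach (object and file names are placeholders), you could generate a format file from the destination table and then import a native-format data file, such as the one exported in the earlier sketch, with it:

        bcp DestDb.dbo.BigTable format nul -f C:\temp\BigTable.fmt -S VM_2 -T -n
        bcp DestDb.dbo.BigTable in C:\temp\BigTable.dat -f C:\temp\BigTable.fmt -S VM_2 -T -b 50000

    The -b switch commits in batches of 50,000 rows, which keeps individual transactions, and therefore the destination's transaction log, smaller during a long-running load.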


  3. Seeya Xi-MSFT 16,436 Reputation points
    2023-03-01T02:08:03.86+00:00

    Hi @techresearch7777777 ,

    Firstly, using the Export Data wizard can cause a significant increase in the transaction log size on both VM_1 and VM_2. This is because the wizard reads data from VM_1 and writes it to VM_2 using a single transaction, which can result in a large transaction log. You can mitigate this by using the "Fast Load" option in the wizard, which minimizes logging during the transfer.

    Secondly, the Export Data wizard may not be the most efficient method for large data sets, especially if the transfer needs to run for hours. In this case, it may be better to use a bulk copy program, such as the bcp utility, to export the data to a flat file and then import it into the destination database. This method can be faster and less resource-intensive, but it does require more manual configuration.

    Ultimately, the best method depends on your specific requirements and the size of the data set. If you have a small to medium-sized data set and want a simple, easy-to-use method, the Export Data wizard can be a good choice. If you have a large data set and need more control over the transfer process, bcp or another bulk copy tool (such as SSIS) may be more suitable.

    Best regards,

    Seeya


    If the answer is the right solution, please click "Accept Answer" and kindly upvote it. If you have extra questions about this answer, please click "Comment".


  4. Erland Sommarskog 100.9K Reputation points MVP
    2023-03-02T20:50:09.99+00:00

    I think Seeya was a little bit too imaginative there. The BULK INSERT command does not take any WHERE clause.

    You can use SELECT * FROM OPENROWSET(BULK), and in this case you can apply a WHERE clause. But obviously, it's better to apply the WHERE clause when you export the data.
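
    As a rough illustration of that option (table, column, file, and format-file names are placeholders), such a statement could look like:

        INSERT INTO dbo.BigTable_Dest (Col1, Col2)
        SELECT src.Col1, src.Col2
        FROM OPENROWSET(BULK 'C:\temp\BigTable.dat', FORMATFILE = 'C:\temp\BigTable.fmt') AS src
        WHERE src.Col1 >= 100;   -- filter applied on the import side

    OPENROWSET(BULK) needs a format file (or one of the SINGLE_* options), and, as noted, filtering in the export query is normally the better place to do it.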

    If you are not acquainted with BCP and the other ways for bulk load and export, I have an article on my web site with all the details you need: https://www.sommarskog.se/bulkload.html.
