SSIS Visual Studio 2017 - How to delete duplicate records if a file is being processed again in foreach loop container

Baker-9022 1 Reputation point

I have an SSIS package with a Foreach Loop container that contains a Data Flow Task; it loops over the text files in a folder and stores their records in a SQL table. I am required to add an "Execute SQL Task" before the Data Flow Task that deletes records if a file is being processed again, to avoid duplicate records. My connection manager is OLE DB. How can I achieve this?

Any help would be greatly appreciated.

SQL Server Integration Services

2 answers

  1. Michael Taylor 51,341 Reputation points

    How do you know that record A in the DB comes from file B that your foreach loop is processing? That is the query you need to write, based upon your rules. Once you have that query, put it into your Execute SQL Task as the WHERE clause of your DELETE command. Example (where the file name is supplied as a parameter by the SSIS task):

    DELETE FROM MyTable WHERE FileName = ?

    (With an OLE DB connection manager, the Execute SQL Task uses ? as the parameter marker rather than a named @parameter; map your file-name variable under Parameter Mapping.)
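    The delete-then-reload pattern described above can be sketched as follows, using Python's sqlite3 here only as a stand-in for the SQL Server target (the table and column names are hypothetical, and sqlite3 happens to share the ? parameter marker used by OLE DB):

    ```python
    import sqlite3

    # In-memory stand-in for the SQL Server destination table.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE MyTable (FileName TEXT, Line TEXT)")

    def load_file(filename, lines):
        # Step 1 (the Execute SQL Task): delete any rows left over from a
        # previous run of the same file, so reprocessing is idempotent.
        conn.execute("DELETE FROM MyTable WHERE FileName = ?", (filename,))
        # Step 2 (the Data Flow Task): insert the file's rows.
        conn.executemany(
            "INSERT INTO MyTable (FileName, Line) VALUES (?, ?)",
            [(filename, line) for line in lines],
        )
        conn.commit()

    load_file("a.txt", ["r1", "r2"])
    load_file("a.txt", ["r1", "r2"])  # same file processed again: no duplicates
    print(conn.execute("SELECT COUNT(*) FROM MyTable").fetchone()[0])  # → 2
    ```

    Running the load twice for the same file leaves exactly one copy of its rows, which is the behaviour the Execute SQL Task is meant to guarantee.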

  2. ZoeHui-MSFT 35,381 Reputation points

    Hi @Baker-9022 ,

    If the table has duplicate records, you can use T-SQL to delete them via an Execute SQL Task.

    DELETE FROM MyDuplicateTable  
    WHERE ID NOT IN  
    (SELECT MAX(ID)  
    FROM MyDuplicateTable  
    GROUP BY DuplicateColumn1, DuplicateColumn2, DuplicateColumn3)  

    Or with this:

     delete x from (  
      select *, rn=row_number() over (partition by [name] order by id)  
      from [0422]  
    ) x  
    where rn > 1;  
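    To see what the ROW_NUMBER() delete does, here is a small sketch in Python's sqlite3 (the table name [0422] and the id/name columns follow the answer's example; SQLite's DELETE does not accept the "delete x from (...)" form, so the row numbers go in a subquery instead):

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute('CREATE TABLE "0422" (id INTEGER PRIMARY KEY, name TEXT)')
    conn.executemany('INSERT INTO "0422" (name) VALUES (?)',
                     [("a",), ("a",), ("b",), ("a",), ("b",)])

    # Same idea as the answer's DELETE: number the rows within each name
    # (ordered by id), keep row 1, and delete every row numbered > 1.
    conn.execute("""
        DELETE FROM "0422" WHERE id IN (
            SELECT id FROM (
                SELECT id,
                       ROW_NUMBER() OVER (PARTITION BY name ORDER BY id) AS rn
                FROM "0422"
            ) WHERE rn > 1
        )
    """)
    conn.commit()
    print(conn.execute('SELECT COUNT(*) FROM "0422"').fetchone()[0])  # → 2
    ```

    Only the first-inserted row for each name survives, so five rows (a, a, b, a, b) collapse to one "a" and one "b".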



    If the answer is the right solution, please click "Accept Answer" and kindly upvote it. If you have extra questions about this answer, please click "Comment".

    Note: Please follow the steps in our documentation to enable e-mail notifications if you want to receive the related email notification for this thread.
