asma gh 21 Reputation points

Hello, I would like to know how to apply an existence check on my data. I load data from several Excel files (one file per week) into a table, and sometimes the same row with the same information appears in two files. I would like to add a script that checks whether a row already exists: if it does, the duplicate row should not be inserted. Any ideas, please? Thank you.

SQL Server Integration Services
A Microsoft platform for building enterprise-level data integration and data transformations solutions.
Transact-SQL
A Microsoft extension to the ANSI SQL language that includes procedural programming, local variables, and various support functions.

3 answers

Sort by: Most helpful
  1. Vaibhav Chaudhari 38,681 Reputation points

    The implementation depends on your scenario:

    Option 1 - Truncate the table first and reload all the data from Excel every time.

    Option 2 - Load the data from Excel into a staging table, then run a stored procedure that loads only the records not already present into the final table.

    Option 3 - After the Excel load, run a stored procedure to remove duplicate, unwanted rows from your final table.
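    Option 2 can be sketched in T-SQL roughly as follows. The table and column names here (`dbo.Staging_WeeklyData`, `dbo.FinalData`, and the key columns `CustomerId`/`WeekDate`) are placeholders for illustration; substitute the columns that uniquely identify a row in your data:

    ```sql
    -- After the SSIS data flow has loaded the weekly Excel file into the
    -- staging table, insert only the rows that do not already exist in
    -- the final table (matched on the business key columns).
    INSERT INTO dbo.FinalData (CustomerId, WeekDate, Amount)
    SELECT s.CustomerId, s.WeekDate, s.Amount
    FROM dbo.Staging_WeeklyData AS s
    WHERE NOT EXISTS (
        SELECT 1
        FROM dbo.FinalData AS f
        WHERE f.CustomerId = s.CustomerId
          AND f.WeekDate   = s.WeekDate
    );

    -- Empty the staging table so it is ready for next week's file.
    TRUNCATE TABLE dbo.Staging_WeeklyData;
    ```

    Wrapping the `INSERT … WHERE NOT EXISTS` and the `TRUNCATE` in a stored procedure lets the SSIS package call it with a single Execute SQL Task after the data flow.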


    Please don't forget to Accept Answer and Up-vote if the response helped -- Vaibhav


  2. asma gh 21 Reputation points

    I use ROW_NUMBER() OVER (PARTITION BY ...) and select only the rows where row_number = 1.
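    That deduplication can be written, for example, like this (the table and key columns are placeholders; partition by whatever columns define a duplicate in your data):

    ```sql
    -- Rank rows within each business key; any duplicate gets row_num > 1,
    -- so keeping row_num = 1 returns one copy of each row.
    WITH Ranked AS (
        SELECT CustomerId, WeekDate, Amount,
               ROW_NUMBER() OVER (
                   PARTITION BY CustomerId, WeekDate, Amount
                   ORDER BY WeekDate
               ) AS row_num
        FROM dbo.Staging_WeeklyData
    )
    SELECT CustomerId, WeekDate, Amount
    FROM Ranked
    WHERE row_num = 1;
    ```

    Note that this removes duplicates within the batch being loaded; to also skip rows that already exist in the final table from a previous week's file, combine it with an existence check against the final table.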


  3. Monalv-MSFT 5,896 Reputation points

    Hi @asma gh ,

    May I know if you are using an SSIS package in SSDT?

    If so, you can use a Lookup Transformation between the Excel Source and the OLE DB Destination, and route only the rows from the "No Match" output into the OLE DB Destination, so rows that already exist in the destination table are not inserted again.

    Please refer to An Overview of the LOOKUP TRANSFORMATION in SSIS.

    Best regards,


    If the answer is helpful, please click "Accept Answer" and upvote it.

    Note: Please follow the steps in our documentation to enable e-mail notifications if you want to receive the related email notification for this thread.