Hey,
There is no need to do full load processing, as it would increase the volume of data moved (ingress and egress).
Here is one possible approach:
- Do incremental loads for updates/inserts (see the watermark sketch after this list).
- For deletes, follow the process below:
Get the list of distinct business keys from the source table. If that count is under 5,000 (i.e. it would never exceed the Lookup activity's 5,000-row limit between two runs), use a Lookup activity and pass its output to a Script activity, with the sink as its connection, that deletes the sink rows whose keys are no longer in that list (see the Script activity sketch after this list).
If the count is greater than 5k, use a Copy activity to land the distinct keys in a staging table, then a Stored Procedure activity that compares the sink against the staging table and deletes the sink rows whose keys are not present in it (see the stored procedure sketch after this list).
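As a rough illustration of the incremental load, here is a minimal watermark-style source query. It assumes the source has a LastModifiedDate column and that you keep the last successfully loaded value in a dbo.WatermarkTable; all of these names are placeholders, not part of your actual setup.

```sql
-- Source query for the incremental Copy activity: pull only rows changed since
-- the previous run. dbo.SourceTable, LastModifiedDate and dbo.WatermarkTable are
-- placeholder names used for illustration.
SELECT *
FROM dbo.SourceTable
WHERE LastModifiedDate > (SELECT WatermarkValue
                          FROM dbo.WatermarkTable
                          WHERE TableName = 'SourceTable');

-- After the copy succeeds, advance the watermark to the newest value just loaded.
UPDATE dbo.WatermarkTable
SET WatermarkValue = (SELECT MAX(LastModifiedDate) FROM dbo.SourceTable)
WHERE TableName = 'SourceTable';
```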
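For the Script activity path, the delete could look something like the sketch below. It assumes the Lookup output has been flattened into a quoted, comma-separated key list held in a pipeline variable (SourceKeyList is a made-up name), and dbo.SinkTable / BusinessKey are placeholders for your sink table and business key column.

```sql
-- Runs in the Script activity against the sink connection.
-- @{variables('SourceKeyList')} is ADF string interpolation, assumed to expand to
-- a quoted, comma-separated list of the business keys returned by the Lookup
-- activity (hypothetical variable name).
DELETE FROM dbo.SinkTable
WHERE BusinessKey NOT IN (@{variables('SourceKeyList')});
```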
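For the staging-table path, a stored procedure along these lines could do the comparison and delete; dbo.StagingKeys, dbo.SinkTable and BusinessKey are again placeholder names.

```sql
-- Hypothetical procedure invoked by the Stored Procedure activity after the
-- Copy activity has landed the distinct source keys in dbo.StagingKeys.
CREATE OR ALTER PROCEDURE dbo.usp_DeleteRowsMissingFromSource
AS
BEGIN
    SET NOCOUNT ON;

    -- Delete sink rows whose business key no longer exists in the source snapshot.
    DELETE s
    FROM dbo.SinkTable AS s
    WHERE NOT EXISTS (SELECT 1
                      FROM dbo.StagingKeys AS k
                      WHERE k.BusinessKey = s.BusinessKey);

    -- Empty the staging table so the next run starts clean.
    TRUNCATE TABLE dbo.StagingKeys;
END;
```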
This keeps end-to-end data integrity within the same pipeline rather than relying on separate triggers for deletes.