Hi @Anonymous ,
Thank you for posting this query in Microsoft Q&A Platform.
Have you already implemented a solution for this requirement and got stuck somewhere? If yes, could you please share your current solution details, along with screenshots if possible? That way I can help you better with the issue.
Below is the solution I would propose.
- The Salesforce cloud connector in Synapse pipelines supports the Copy activity, so copy the data from Salesforce cloud to ADLS Gen2 as files, as-is, using a Copy activity in a Synapse pipeline.
- From the ADLS Gen2 files, load the data into Synapse SQL tables.
While loading into Synapse SQL tables, we need to identify which rows to load. In your case we cannot rely on the last updated date column, since, as you mentioned in your question, it is not trustworthy. Hence we need to generate a hash of every source row, compare it with the hash of the corresponding sink row, and load only the rows that differ. We can do this using data flows.
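To make the comparison concrete, here is a minimal Python sketch of the same hash-and-compare idea (in the actual data flow you would use a hash expression such as sha2 instead). The column names `Id` and `Name` and the sample rows are assumptions purely for illustration:

```python
import hashlib

def row_hash(row: dict) -> str:
    # Concatenate column values in a fixed column order before hashing,
    # mirroring what a hash expression would do in a mapping data flow.
    joined = "|".join(str(row[col]) for col in sorted(row))
    return hashlib.sha256(joined.encode("utf-8")).hexdigest()

def rows_to_load(source_rows, sink_rows, key="Id"):
    # Hash the existing sink rows, keyed by a business key
    # (assumed here to be a column named "Id").
    sink_hashes = {r[key]: row_hash(r) for r in sink_rows}
    # Keep source rows that are new, or whose hash differs from the sink's.
    return [
        r for r in source_rows
        if r[key] not in sink_hashes or row_hash(r) != sink_hashes[r[key]]
    ]

# Hypothetical sample data: row 2 is new, row 1 is unchanged.
source = [{"Id": 1, "Name": "A"}, {"Id": 2, "Name": "B"}]
sink = [{"Id": 1, "Name": "A"}]
changed = rows_to_load(source, sink)  # only row with Id 2 is loaded
```

The same logic in a data flow would be an Exists/Lookup transformation against the sink on the key column, followed by a filter on unequal hashes.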
Please watch the video below, which explains this same approach of hashing rows and comparing them with the sink.
How to Capture Changed Data using Data flows in Azure Data Factory
Hope this helps. Please let me know if you have any further queries.
-------------
- Please don't forget to click on the accept answer or upvote button whenever the information provided helps you. Original posters help the community find answers faster by identifying the correct answer. Here is how
- Want a reminder to come back and check responses? Here is how to subscribe to a notification