How to copy D365 data into an Azure SQL DB
Hello,
This is going to be long-winded, so please bear with me. For the past year I've been extracting data out of D365 using a basic setup: a Synapse Link with incremental folder updates in Power Apps to dump the data into a storage account, and then Data Factory with the Microsoft template "Copy Dataverse data into Azure SQL using Synapse Link". This has all worked fine. However, I am now trying to get the auditing data out of Dynamics 365, so I've set up a separate Synapse Link that connects to a Synapse Analytics Workspace and uses an Apache Spark pool for Delta Lake data conversion. That also works fine, but I can't use the same Data Factory template to get the data written into an Azure SQL DB. After a lot of testing, I've discovered that the model.json file that gets produced in the storage account, and that triggers the Data Factory pipeline, differs from the one produced by my regular Synapse Link:
- The regular Synapse Link's model.json specifies the partitioning of each entity.
- The model.json from the new Synapse Link for auditing does not.
- The Data Factory pipeline (specifically the Data Flow that actually does the upsert/delete etc.) relies on the partitioning being specified in model.json.
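For reference, the difference looks roughly like this. The entity and partition names below are illustrative, and the exact shape of the model.json metadata may vary between link configurations, but this sketch mirrors the check the pipeline appears to make:

```python
import json

# Illustrative model.json fragments (field names follow the general
# CDM/Dataverse model.json shape; check your own files for the exact format).

# Regular Synapse Link: each entity lists its data partitions.
regular_entity = json.loads("""
{
  "name": "account",
  "partitions": [
    {"name": "2023", "location": "https://storageaccount.example/account/2023.csv"},
    {"name": "2024", "location": "https://storageaccount.example/account/2024.csv"}
  ]
}
""")

# Synapse Link for auditing (Delta Lake conversion): no partitions attribute.
audit_entity = json.loads("""
{
  "name": "audit"
}
""")

def has_partitions(entity: dict) -> bool:
    """Mirror of the template's if-condition: process an entity only
    when it declares at least one partition."""
    return len(entity.get("partitions", [])) > 0

print(has_partitions(regular_entity))  # True  -> Data Flow runs
print(has_partitions(audit_entity))    # False -> Data Flow is skipped
```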
The screenshots above show that, when it comes to processing the data, there is an If Condition that only runs the actual Data Flow if partitions are listed in the model.json file. Since this attribute is missing from the model.json produced by the Synapse Link for auditing (even though the entity is partitioned by year), the condition returns false and the Data Flow is skipped. If I force the Data Flow to run on the false branch, it completes successfully, but no data is inserted.
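Since the audit data is laid out by year on disk but its model.json omits the partitions attribute, one direction I'm considering is a small pre-processing step that injects synthesized partition entries into model.json before the pipeline reads it. A minimal sketch follows; all paths and names are placeholders, the year folders would in practice come from listing the storage account with the storage SDK, and whether the downstream Data Flow accepts a patched model.json like this is exactly the open question:

```python
import json

# A stripped-down audit entity as produced by the auditing Synapse Link:
# no "partitions" attribute, even though the data is laid out by year.
model = json.loads('{"entities": [{"name": "audit"}]}')

# Year folders observed in the storage account (hard-coded here to keep
# the sketch self-contained; in practice, list them via the storage SDK).
year_folders = ["2022", "2023", "2024"]
base_url = "https://<storage-account>.dfs.core.windows.net/<container>/audit"

for entity in model["entities"]:
    # Synthesize one partition entry per year folder, mimicking the shape
    # the regular Synapse Link writes (an assumption worth verifying
    # against a real model.json from the regular link).
    entity["partitions"] = [
        {"name": year, "location": f"{base_url}/{year}"}
        for year in year_folders
    ]

print(json.dumps(model, indent=2))
```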
I can't find any information on how to get this data into an Azure SQL DB. Is there something in the Microsoft Data Factory template that I need to change, or is there another template or a different solution altogether that I should use?
Thank you,
Diana