Copy Dataverse data into Azure SQL

Jon McCormac 1 Reputation point
2022-09-05T16:01:00.49+00:00

Hi, I am working through a test of this as part of the D365 Data Export Service deprecation and have hit a blocker:

https://learn.microsoft.com/en-us/power-apps/maker/data-platform/azure-synapse-link-pipelines

I have set up a test Dataverse -> Azure Synapse Link for Dataverse -> Data Lake flow from Power Platform, and the files are exporting as expected into the storage account's directory structure.

When I get to configuring the trigger, however, I can't get it to work. With "Blob path ends with" set to /model.json, I get 0 blobs matched, because my model.json is in the root directory of the storage container.
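
For reference, the trigger I am trying to create is equivalent to the sketch below (a minimal sketch assuming the azure-mgmt-datafactory SDK; every name, ID, and path is a placeholder, not the template's actual values):

```python
# Sketch of the storage event trigger the template configures, created with
# the azure-mgmt-datafactory SDK. All names, IDs, and paths are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    BlobEventsTrigger,
    PipelineReference,
    TriggerPipelineReference,
    TriggerResource,
)

SUBSCRIPTION_ID = "<subscription-id>"   # placeholder
RESOURCE_GROUP = "<resource-group>"     # placeholder
FACTORY_NAME = "<data-factory-name>"    # placeholder
STORAGE_ACCOUNT_ID = (                  # placeholder resource ID of the data lake
    "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
    "/providers/Microsoft.Storage/storageAccounts/<storage-account>"
)

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

trigger = BlobEventsTrigger(
    # Fire when the export writes model.json. Note the "ends with" filter of
    # "/model.json" only matches a blob that sits INSIDE a folder; a
    # model.json at the container root never matches, which is the 0-blobs
    # problem described above.
    blob_path_begins_with="/<container-name>/blobs/",
    blob_path_ends_with="/model.json",
    events=["Microsoft.Storage.BlobCreated"],
    scope=STORAGE_ACCOUNT_ID,
    pipelines=[
        TriggerPipelineReference(
            pipeline_reference=PipelineReference(reference_name="<pipeline-name>")
        )
    ],
)

client.triggers.create_or_update(
    RESOURCE_GROUP, FACTORY_NAME, "DataverseModelJsonTrigger",
    TriggerResource(properties=trigger),
)
```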

I can't get past this. Has anyone successfully used this new template?

Azure Data Factory

1 answer

  1. Binway 696 Reputation points
    2023-04-21T03:39:41.4233333+00:00

    Just adding what I found when I was having the same issue, since this seems to have been asked a number of times and I didn't see an appropriate answer. The instructions for copying Dataverse data into Azure SQL at https://learn.microsoft.com/en-us/power-apps/maker/data-platform/azure-synapse-link-pipelines?tabs=synapse-analytics list the prerequisites, one of which is to enable incremental folder updates; refer to https://learn.microsoft.com/en-us/power-apps/maker/data-platform/azure-synapse-incremental-update. When you set up the link and add tables, you must enable incremental updates.

    [Screenshot: the incremental update setting when adding a table to the Azure Synapse Link]

    Without that setting, the directory structure in your storage account is completely different: model.json sits at the root of the container instead of inside a timestamped folder, so the "Blob path ends with: /model.json" filter has nothing to match when you create the trigger. With incremental folder updates enabled, each update folder contains its own model.json and the trigger fires as the template expects.
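
    A quick way to confirm which layout you have is to list the blobs and see where model.json sits. A minimal sketch, assuming the azure-storage-blob package; the connection string and container name are placeholders:

    ```python
    # List every model.json in the export container to see whether it sits at
    # the root (incremental updates off) or inside folders (incremental on).
    # The connection string and container name below are placeholders.
    from azure.storage.blob import ContainerClient

    container = ContainerClient.from_connection_string(
        "<storage-connection-string>",  # placeholder
        "<dataverse-container>",        # placeholder
    )

    for blob in container.list_blobs():
        if blob.name.endswith("model.json"):
            # Incremental off: prints "model.json" (container root), which the
            # "/model.json" trigger filter cannot match.
            # Incremental on:  prints e.g. "<timestamp-folder>/model.json",
            # which the filter does match.
            print(blob.name)
    ```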

