Issue with copy data from Azure Synapse Link for Dataverse using ADF Dataflow

Raheel Islam 21 Reputation points
2024-07-01T11:07:06.5533333+00:00

Hi there,

We currently use the Export to Data Lake option to copy data from D365, and we want to move to Azure Synapse Link for Dataverse because the Export to Data Lake option is being deprecated as of 1 November 2024.

We have configured Azure Synapse Link for Dataverse in the UAT environment, as shown in the screenshot below. [screenshot]

It has created a new container with several timestamped folders.

[screenshot]

The folders above contain the tables and their data.

[screenshot]

I have created a data pipeline in ADF to get the custtable data using a dataflow, with the source path set to the folder highlighted in the screenshot above. This works fine, but whenever I make changes in D365, a new folder with a different timestamp is created containing only the modified rows, as you can see in the screenshot below.

[screenshot]

I want to get the custtable (or any specific entity) data using an ADF Dataflow, but the issue is that I need to define the full path down to the timestamp folder, and a new timestamp folder is created whenever the entity is updated, so the path is not fixed. How can I get the custtable data every day when I don't know the path in advance?

Here is the Dataflow code:

[screenshot]

[screenshot]

Tags: Azure Data Lake Storage, Azure Data Factory
1 answer

  1. AnnuKumari-MSFT 33,151 Reputation points Microsoft Employee
    2024-07-03T17:48:41.4633333+00:00

    Hi Raheel Islam,

    Thank you for posting your query on the Microsoft Q&A platform.

    It seems you are facing an issue getting the updated data from D365 to Azure Synapse Link for Dataverse using an ADF Dataflow. As you mentioned, a new folder with a different timestamp is created whenever there is an update in the entity, and you need to define the full path down to the timestamp folder to get the updated data.

    To get the updated data every day, you can build a dynamic path with a suitable expression to fetch the latest folder, instead of hardcoding the path. Additionally, instead of deriving the timestamp from the folder name, I would suggest relying on the 'lastModified' timestamp, which can be fetched using the Get Metadata activity in an ADF pipeline.
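    As a rough illustration of the "pick the latest folder" step, here is a minimal Python sketch. It is not ADF code; it only mirrors the selection logic you would express in the pipeline (Get Metadata returning child folders with their lastModified values, then choosing the newest). All folder names and timestamps below are hypothetical examples.

```python
# Minimal sketch: given child folders with their lastModified timestamps
# (as a Get Metadata activity could expose them), pick the newest folder.
from datetime import datetime, timezone

def pick_latest(folders):
    """Return the name of the folder with the greatest lastModified value.

    `folders` is a list of (name, last_modified) tuples.
    """
    if not folders:
        raise ValueError("no timestamp folders found in the container")
    return max(folders, key=lambda f: f[1])[0]

# Hypothetical Synapse Link timestamp folders.
folders = [
    ("2024-06-28T09.15.00Z", datetime(2024, 6, 28, 9, 15, tzinfo=timezone.utc)),
    ("2024-07-01T11.07.06Z", datetime(2024, 7, 1, 11, 7, 6, tzinfo=timezone.utc)),
    ("2024-06-30T22.00.00Z", datetime(2024, 6, 30, 22, 0, tzinfo=timezone.utc)),
]
print(pick_latest(folders))  # -> 2024-07-01T11.07.06Z
```

    In ADF itself, the same idea can be expressed without custom code: a Get Metadata activity listing childItems on the container, a ForEach or filter step to find the newest child, and a pipeline parameter that passes the winning folder name into the dataflow's source path.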

    A similar scenario is covered in this video: How to load latest and greatest data

    Kindly see if you can implement a similar approach in your case to fetch the latest folder.

    Hope it helps. Please do let us know how it goes. Thank you.

