Copy Dataverse data from ADLS Gen2 to Azure SQL DB using the Azure Synapse Link template in Synapse not working for multiple entities

CHARAN TEJA NALLURI 6 Reputation points
2023-03-27T12:04:42.96+00:00

Hi,

 While using the template for copying data from Dataverse to SQL via Azure Synapse Link, the template does not work when configured for multiple entities.

Issues I was facing

  1. After configuring the dataflows, the pipelines run successfully, but no data is moved into SQL.
  2. SQL Server has a limit of 8,060 bytes per row. What is another option for migrating data if a table has more columns than fit within that limit?
  3. I am using temporal tables with system versioning on to track the changes for each row. The Microsoft template uses upsert, and each time the pipeline is triggered the upsert inserts new records and updates the existing ones, so new rows are created in the history table on every run, even for records that have not changed.
  4. I see no other way to update records with that template, across multiple entities, without using upsert (and without running into the issue above). If there is one, can you suggest the logic?

1 answer

  1. AnnuKumari-MSFT 32,906 Reputation points Microsoft Employee
    2023-03-28T10:23:24.94+00:00

    @CHARAN TEJA NALLURI ,

    Welcome to the Microsoft Q&A platform, and thanks for posting your question here.

    As I understand your query, you are trying to copy data from Dataverse to SQL Server using the Synapse Link template, but the data is not getting copied to the destination. Kindly let me know if that is not the case.

    • I am assuming that you are not receiving any errors while executing the pipeline, yet no data lands in SQL. It might be that the data types in Dataverse and SQL Server are not compatible. Check the data types of the columns on both sides and ensure that they map correctly.
    • Coming to the SQL Server limit of 8,060 bytes per row, you can consider splitting the table into multiple tables, i.e. vertical partitioning, so that some of the columns are stored in a separate table keyed by the same primary key (see the first sketch after this list).
    • Regarding your next query, if you are using temporal tables with system versioning on to track changes, you can use a MERGE statement whose update branch only fires for rows whose values have actually changed, so the history table does not accumulate versions for untouched records (see the MERGE sketch below).
    • In case you are using multiple entities and want to update records without using upsert, you can use an UPDATE statement against the SQL Server table instead; the MERGE sketch below also covers this case, since its matched branch is an UPDATE keyed on the primary key. Either way, you will need to ensure that the primary key of the table is included in the data that you are copying from Dataverse to SQL Server.
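
    To check the type mapping, you can inspect the destination schema and compare it against the entity's attribute types in Dataverse. A minimal query (dbo.Account is a hypothetical target table, not something the template creates):

    ```sql
    -- List the destination columns and their SQL Server types for comparison
    -- against the Dataverse attribute types of the same entity.
    SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH, IS_NULLABLE
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE TABLE_SCHEMA = 'dbo' AND TABLE_NAME = 'Account'
    ORDER BY ORDINAL_POSITION;
    ```

    For the row-size limit, here is a minimal sketch of the vertical-partitioning idea; the table and column names are hypothetical and only illustrate the pattern:

    ```sql
    -- Core table: the frequently used columns, keyed by the Dataverse row id.
    CREATE TABLE dbo.Account_Core (
        Id        UNIQUEIDENTIFIER NOT NULL PRIMARY KEY,
        Name      NVARCHAR(200),
        CreatedOn DATETIME2
    );

    -- Extension table: the remaining columns, 1:1 on the same key.
    CREATE TABLE dbo.Account_Extended (
        Id              UNIQUEIDENTIFIER NOT NULL PRIMARY KEY
                        REFERENCES dbo.Account_Core (Id),
        LongDescription NVARCHAR(MAX),  -- (MAX) values can be stored off-row,
        Notes           NVARCHAR(MAX)   -- which also relieves the 8,060-byte in-row limit
    );

    -- Reassemble the full row when needed:
    SELECT c.Id, c.Name, c.CreatedOn, e.LongDescription, e.Notes
    FROM dbo.Account_Core AS c
    LEFT JOIN dbo.Account_Extended AS e ON e.Id = c.Id;
    ```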

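    To make the MERGE idea concrete, here is a minimal sketch, assuming the pipeline first lands the Dataverse rows in a staging table (dbo.Account_Staging and all column names are hypothetical). The WHEN MATCHED AND predicate is what prevents the no-op updates that were filling your history table:

    ```sql
    -- Upsert from staging into a system-versioned target, but only update rows
    -- whose values actually changed, so no history version is written for
    -- records that are identical on both sides.
    MERGE dbo.Account AS tgt
    USING dbo.Account_Staging AS src
        ON tgt.Id = src.Id                          -- match on the Dataverse primary key
    WHEN MATCHED AND (
            ISNULL(tgt.Name,  N'') <> ISNULL(src.Name,  N'')
         OR ISNULL(tgt.Email, N'') <> ISNULL(src.Email, N'')
            -- ...one predicate per tracked column...
        )
        THEN UPDATE SET
            Name  = src.Name,
            Email = src.Email
    WHEN NOT MATCHED BY TARGET
        THEN INSERT (Id, Name, Email)
             VALUES (src.Id, src.Name, src.Email);
    ```

    System versioning records a history row for every UPDATE, even one that writes identical values, which is why a plain upsert grows the history table on each trigger; gating the update on an actual change avoids that.
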
    You can find additional details here: Copy Dataverse data into Azure SQL


    Hope it helps. Kindly accept the answer by clicking on the Accept answer button.

