Azure Synapse Link for SQL on Azure VM

John K 186 Reputation points
2022-10-20T05:51:55.45+00:00

As the Dataverse Data Export Service (DES) will be retired in November 2022, I am checking how to push the data that Azure Synapse Link writes to the data lake into SQL Server on an Azure VM.

I've checked the ADF approach mentioned on the following page.

https://learn.microsoft.com/en-us/power-apps/maker/data-platform/azure-synapse-link-pipelines?tabs=data-factory

It uses an ADF data flow activity, but I cannot write to SQL Server on an Azure VM because data flows don't support the self-hosted integration runtime (SHIR), which is what is needed to copy data to SQL Server on an Azure VM.

Could you advise how we can use ADF to write the Synapse Link data from the data lake to SQL Server on an Azure VM? Do you have any other template or approach?


1 answer

  1. Bhargava-MSFT 30,496 Reputation points Microsoft Employee
    2022-11-10T02:46:46.337+00:00

    Hi @John K,
    Sorry for the delayed response.
    Here are two options suggested by my internal team:

    Option 1
    a. Disconnect the current export job to ADLS.
    b. Set up a Synapse workspace.
    c. Reconnect the Azure Synapse Link integration and map it to the Synapse workspace (see the screenshot below).
    d. Use a pipeline Copy activity to move data from the Synapse serverless SQL endpoint (Dataverse entities) into SQL Server on the VM (SQL IaaS) using the SHIR (see the sketch after this list).
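
    For illustration, here is a minimal sketch of what the Copy activity in step (d) could look like, written as a Python dict that mirrors the pipeline JSON. The dataset names, table name, and query are placeholders I made up for this example; the connector type names follow the usual ADF Copy activity shape, but please verify them against your own linked services.

    ```python
    import json

    # Sketch only: dataset / linked-service names below are placeholders.
    # Source: a Dataverse entity exposed through the Synapse serverless SQL endpoint.
    # Sink:   a SQL Server table on the Azure VM, reached through a linked service
    #         that is bound to the self-hosted integration runtime (SHIR).
    copy_activity = {
        "name": "CopyAccountEntityToSqlVm",
        "type": "Copy",
        "inputs": [{"referenceName": "ServerlessSqlAccountView", "type": "DatasetReference"}],
        "outputs": [{"referenceName": "SqlVmAccountTable", "type": "DatasetReference"}],
        "typeProperties": {
            "source": {
                "type": "AzureSqlSource",   # serverless SQL endpoint via the Azure SQL connector
                "sqlReaderQuery": "SELECT * FROM dbo.account"
            },
            "sink": {
                "type": "SqlServerSink"     # SQL Server on the VM, executed through the SHIR
            }
        }
    }

    print(json.dumps(copy_activity, indent=2))
    ```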

    Option 2
    a. Use data flows to read from Dataverse and stage the data in an Azure Storage account in CSV/Parquet format.
    b. Use a Copy activity to read the staged copy and write it to SQL Server on the VM (SQL IaaS) through the SHIR (see the sketch after this list).
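
    Similarly, here is a rough sketch of the second step of option 2: a Copy activity that reads the staged Parquet files and writes them to SQL Server on the VM. Again, the names are placeholders, and the exact properties depend on how your staging dataset is defined.

    ```python
    import json

    # Sketch only: reads the Parquet files staged by the data flow in ADLS Gen2
    # and writes them to SQL Server on the Azure VM through the SHIR.
    copy_staged_to_sql = {
        "name": "CopyStagedParquetToSqlVm",
        "type": "Copy",
        "inputs": [{"referenceName": "StagedAccountParquet", "type": "DatasetReference"}],
        "outputs": [{"referenceName": "SqlVmAccountTable", "type": "DatasetReference"}],
        "typeProperties": {
            "source": {
                "type": "ParquetSource",
                "storeSettings": {"type": "AzureBlobFSReadSettings", "recursive": True}
            },
            "sink": {
                "type": "SqlServerSink"
            }
        }
    }

    print(json.dumps(copy_staged_to_sql, indent=2))
    ```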

    I hope this helps.

    [Screenshot: Azure Synapse Link for Dataverse configuration mapped to the Synapse workspace]

    1 person found this answer helpful.
