Create a V2 data factory (SQL On-prem)
This template creates a version 2 (V2) data factory with a pipeline that copies data from a table in an on-premises SQL Server database to a folder in a container in Azure Blob storage.
When you deploy this Azure Resource Manager template, a V2 data factory is created with the following entities (a structural sketch follows the list):
- On-premises SQL Server linked service
- Azure Storage linked service
- On-premises SQL Server input dataset
- Azure Blob output dataset
- Pipeline with a copy activity
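For orientation, the sketch below shows one way these entities can be laid out in an ARM template, nested under the factory resource. It is a trimmed, illustrative sketch only: the resource names, parameter names, and property values are assumptions and do not necessarily match those in this template's azuredeploy.json, and several of the entities listed above are elided.

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "dataFactoryName": { "type": "string" },
    "sqlServerConnectionString": { "type": "securestring" }
  },
  "resources": [
    {
      // The V2 data factory that hosts all of the entities below.
      "type": "Microsoft.DataFactory/factories",
      "apiVersion": "2018-06-01",
      "name": "[parameters('dataFactoryName')]",
      "location": "[resourceGroup().location]",
      "properties": {},
      "resources": [
        {
          // Self-hosted integration runtime used to reach the on-premises SQL Server.
          "type": "integrationRuntimes",
          "apiVersion": "2018-06-01",
          "name": "SelfHostedIR",
          "dependsOn": [
            "[resourceId('Microsoft.DataFactory/factories', parameters('dataFactoryName'))]"
          ],
          "properties": { "type": "SelfHosted" }
        },
        {
          // On-premises SQL Server linked service; the connection string is passed as a
          // SecureString, and the linked service connects through the self-hosted IR.
          "type": "linkedservices",
          "apiVersion": "2018-06-01",
          "name": "SqlServerLinkedService",
          "dependsOn": [
            "[resourceId('Microsoft.DataFactory/factories', parameters('dataFactoryName'))]",
            "[resourceId('Microsoft.DataFactory/factories/integrationRuntimes', parameters('dataFactoryName'), 'SelfHostedIR')]"
          ],
          "properties": {
            "type": "SqlServer",
            "connectVia": {
              "referenceName": "SelfHostedIR",
              "type": "IntegrationRuntimeReference"
            },
            "typeProperties": {
              "connectionString": {
                "type": "SecureString",
                "value": "[parameters('sqlServerConnectionString')]"
              }
            }
          }
        }
        // The Azure Storage linked service, the SqlServerTable input dataset, the AzureBlob
        // output dataset, and the pipeline with its Copy activity follow the same nesting
        // pattern and are omitted here; see azuredeploy.json for the actual definitions.
      ]
    }
  ]
}
```

Nesting the entities inside the factory keeps them clearly scoped to it; the same entities could equally be declared at the top level of the template using fully qualified types such as Microsoft.DataFactory/factories/linkedservices.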
Prerequisites
The prerequisites for this template are mentioned in the Tutorial: copy data from on-premises SQL Server database to Azure Blob Storage article.
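When you deploy the template, you typically supply values such as the data factory name and the connection strings for the on-premises SQL Server database and the storage account, for example through a parameter file. The sketch below shows what such a parameter file might look like; the parameter names and placeholder values are assumptions, so check azuredeploy.json for the parameters this template actually defines.

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    // Parameter names are illustrative; replace the placeholder values with your own.
    "dataFactoryName": { "value": "yourdatafactoryname" },
    "sqlServerConnectionString": {
      "value": "Server=yourserver;Database=yourdatabase;User ID=youruser;Password=yourpassword;"
    },
    "storageConnectionString": {
      "value": "DefaultEndpointsProtocol=https;AccountName=yourstorageaccount;AccountKey=yourkey"
    }
  }
}
```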
Next steps
- Click the Deployment succeeded message.
- Click Go to resource group.
- Search for the data factory that was created.
- Select your data factory to launch the Data Factory page.
- Click Author & Monitor to launch the Data Factory UI application in a separate tab.
- Click Connections at the bottom of the window.
- Switch to the Integration Runtimes window.
- Click the Edit (pencil) icon for your self-hosted IR.
- Click the Copy button for Key1 to copy the key to the clipboard.
- Install the self-hosted integration runtime by following the instructions in this article: Install and register self-hosted IR from download center. Use the key you copied in the previous step to register the integration runtime.
- Now, run and monitor the pipeline by using the steps in the tutorial article.
Tags: Microsoft.DataFactory/factories, linkedservices, AzureStorage, SecureString, integrationRuntimes, SelfHosted, SqlServer, IntegrationRuntimeReference, datasets, SqlServerTable, LinkedServiceReference, AzureBlob, pipelines, Copy, BlobSource, SqlSink, DatasetReference, Microsoft.Storage/storageAccounts