# Create a Data Factory, Copy from Blob to SQL with Sproc
This template creates a Data Factory pipeline that copies data from a file in Azure Blob Storage into an Azure SQL Database table, invoking a stored procedure (sproc) to perform the write.
Complete the following steps before deploying the template:
- Complete the prerequisites in the Overview and prerequisites article.
- Update the values of the following parameters in the azuredeploy.parameters.json file:
- storageAccountName
- storageAccountKey
- sqlServerName
- sqlDatabaseName
- sqlUserId
- sqlPassword
- Create a stored procedure in your Azure SQL Database. Run the following query to align with the tutorial:
```sql
CREATE PROCEDURE spWriteEmployee
AS
BEGIN
    INSERT INTO [dbo].[emp](First, Last)
    VALUES ('Bill', 'Gates')
END
```
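The stored procedure inserts into a dbo.emp table with First and Last columns. If that table does not already exist in your database, a minimal definition along these lines would work (the column types here are assumptions, not taken from this template):

```sql
-- Target table for the copy activity; column sizes are illustrative.
CREATE TABLE [dbo].[emp]
(
    ID int IDENTITY(1,1) NOT NULL,
    First varchar(50),
    Last varchar(50)
)
```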
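As a sketch of what the updated parameters file might look like, using the standard ARM deployment-parameters schema and placeholder values (confirm the exact parameter set against the azuredeploy.json in this sample):

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "storageAccountName": { "value": "<your-storage-account-name>" },
    "storageAccountKey": { "value": "<your-storage-account-key>" },
    "sqlServerName": { "value": "<your-sql-server-name>" },
    "sqlDatabaseName": { "value": "<your-database-name>" },
    "sqlUserId": { "value": "<your-sql-user>" },
    "sqlPassword": { "value": "<your-sql-password>" }
  }
}
```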
## Deploying the sample
You can deploy this sample directly through the Azure Portal or by using the scripts supplied in the root of the repository.
To deploy a sample using the Azure Portal, click the Deploy to Azure button at the top of the article.
To deploy the sample via the command line (using Azure PowerShell or the Azure CLI), execute the appropriate script from the root folder and pass in the folder name of the sample (101-data-factory-blob-to-sql-copy-stored-proc). For example:
```powershell
.\Deploy-AzureResourceGroup.ps1 -ResourceGroupLocation 'eastus' -ArtifactStagingDirectory 101-data-factory-blob-to-sql-copy-stored-proc
```
```bash
azure-group-deploy.sh -a 101-data-factory-blob-to-sql-copy-stored-proc -l eastus
```
Tags: Microsoft.DataFactory/datafactories, linkedservices, AzureStorage, AzureSqlDatabase, datasets, AzureBlob, TextFormat, AzureSqlTable, dataPipelines, Copy, BlobSource, SqlSink, SqlServerStoredProcedure