APPLIES TO: Azure Data Factory | Azure Synapse Analytics
Tip
Try out Data Factory in Microsoft Fabric, an all-in-one analytics solution for enterprises. Microsoft Fabric covers everything from data movement to data science, real-time analytics, business intelligence, and reporting. Learn how to start a new trial for free!
This article describes how to run a SQL Server Integration Services (SSIS) package in an Azure Data Factory pipeline using PowerShell and the Execute SSIS Package activity.
Note
We recommend that you use the Azure Az PowerShell module to interact with Azure. To get started, see Install Azure PowerShell. To learn how to migrate to the Az PowerShell module, see Migrate Azure PowerShell from AzureRM to Az.
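If you haven't installed the Az module or signed in to Azure yet, a minimal sketch of those one-time steps looks like the following; run them in a PowerShell session before you continue:

# Install the Az PowerShell module for the current user (one time).
Install-Module -Name Az -Scope CurrentUser -Repository PSGallery -Force

# Sign in to your Azure subscription interactively.
Connect-AzAccount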
If you don't already have an Azure-SSIS integration runtime (IR), create one by following the step-by-step instructions in the Tutorial: Provisioning Azure-SSIS IR.
You can either use an existing data factory that already has an Azure-SSIS IR provisioned or create a new data factory with one by following the step-by-step instructions in the Tutorial: Deploy SSIS packages to Azure via PowerShell.
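The commands later in this article assume two variables, $ResGrp and $DataFactory, that hold your resource group and data factory objects. If you're starting from a fresh PowerShell session rather than continuing from the tutorial, the following sketch populates them and starts the Azure-SSIS IR; the resource group, data factory, and IR names are placeholders for your own:

# Look up the existing resource group and data factory.
$ResGrp = Get-AzResourceGroup -Name "MyResourceGroup"
$DataFactory = Get-AzDataFactoryV2 -ResourceGroupName $ResGrp.ResourceGroupName -Name "MyDataFactory"

# Make sure the Azure-SSIS IR is started before you run packages on it.
Start-AzDataFactoryV2IntegrationRuntime -ResourceGroupName $ResGrp.ResourceGroupName `
    -DataFactoryName $DataFactory.DataFactoryName `
    -Name "MyAzureSSISIR" -Force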
In this step, you create a pipeline with an Execute SSIS Package activity. The activity runs your SSIS package.
Create a JSON file named RunSSISPackagePipeline.json in the C:\ADF\RunSSISPackage folder with content similar to the following example.
Important
Replace object names, descriptions, and paths; property or parameter values; passwords; and other variable values before you save the file.
{
    "name": "RunSSISPackagePipeline",
    "properties": {
        "activities": [{
            "name": "MySSISActivity",
            "description": "My SSIS package/activity description",
            "type": "ExecuteSSISPackage",
            "typeProperties": {
                "connectVia": {
                    "referenceName": "MyAzureSSISIR",
                    "type": "IntegrationRuntimeReference"
                },
                "executionCredential": {
                    "domain": "MyExecutionDomain",
                    "userName": "MyExecutionUsername",
                    "password": {
                        "type": "SecureString",
                        "value": "MyExecutionPassword"
                    }
                },
                "runtime": "x64",
                "loggingLevel": "Basic",
                "packageLocation": {
                    "type": "SSISDB",
                    "packagePath": "MyFolder/MyProject/MyPackage.dtsx"
                },
                "environmentPath": "MyFolder/MyEnvironment",
                "projectParameters": {
                    "project_param_1": {
                        "value": "123"
                    },
                    "project_param_2": {
                        "value": {
                            "value": "@pipeline().parameters.MyProjectParameter",
                            "type": "Expression"
                        }
                    }
                },
                "packageParameters": {
                    "package_param_1": {
                        "value": "345"
                    },
                    "package_param_2": {
                        "value": {
                            "type": "AzureKeyVaultSecret",
                            "store": {
                                "referenceName": "myAKV",
                                "type": "LinkedServiceReference"
                            },
                            "secretName": "MyPackageParameter"
                        }
                    }
                },
                "projectConnectionManagers": {
                    "MyAdonetCM": {
                        "username": {
                            "value": "MyConnectionUsername"
                        },
                        "password": {
                            "value": {
                                "type": "SecureString",
                                "value": "MyConnectionPassword"
                            }
                        }
                    }
                },
                "packageConnectionManagers": {
                    "MyOledbCM": {
                        "username": {
                            "value": {
                                "value": "@pipeline().parameters.MyConnectionUsername",
                                "type": "Expression"
                            }
                        },
                        "password": {
                            "value": {
                                "type": "AzureKeyVaultSecret",
                                "store": {
                                    "referenceName": "myAKV",
                                    "type": "LinkedServiceReference"
                                },
                                "secretName": "MyConnectionPassword",
                                "secretVersion": "MyConnectionPasswordVersion"
                            }
                        }
                    }
                },
                "propertyOverrides": {
                    "\\Package.MaxConcurrentExecutables": {
                        "value": 8,
                        "isSensitive": false
                    }
                }
            },
            "policy": {
                "timeout": "0.01:00:00",
                "retry": 0,
                "retryIntervalInSeconds": 30
            }
        }]
    }
}
To execute packages stored in the file system or Azure Files, enter the values for your package and log location properties as follows:
{
    "packageLocation": {
        "type": "File",
        "packagePath": "//MyStorageAccount.file.core.windows.net/MyFileShare/MyPackage.dtsx",
        "typeProperties": {
            "packagePassword": {
                "type": "SecureString",
                "value": "MyEncryptionPassword"
            },
            "accessCredential": {
                "domain": "Azure",
                "userName": "MyStorageAccount",
                "password": {
                    "type": "SecureString",
                    "value": "MyAccountKey"
                }
            }
        }
    },
    "logLocation": {
        "type": "File",
        "logPath": "//MyStorageAccount.file.core.windows.net/MyFileShare/MyLogFolder",
        "typeProperties": {
            "accessCredential": {
                "domain": "Azure",
                "userName": "MyStorageAccount",
                "password": {
                    "type": "AzureKeyVaultSecret",
                    "store": {
                        "referenceName": "myAKV",
                        "type": "LinkedServiceReference"
                    },
                    "secretName": "MyAccountKey"
                }
            }
        }
    }
}
To execute packages within projects stored in the file system or Azure Files, enter the values for your package location properties as follows:
{
    "packageLocation": {
        "type": "File",
        "packagePath": "//MyStorageAccount.file.core.windows.net/MyFileShare/MyProject.ispac:MyPackage.dtsx",
        "typeProperties": {
            "packagePassword": {
                "type": "SecureString",
                "value": "MyEncryptionPassword"
            },
            "accessCredential": {
                "domain": "Azure",
                "userName": "MyStorageAccount",
                "password": {
                    "type": "SecureString",
                    "value": "MyAccountKey"
                }
            }
        }
    }
}
To execute embedded packages, enter the values for your package location properties as follows:
{
    "packageLocation": {
        "type": "InlinePackage",
        "typeProperties": {
            "packagePassword": {
                "type": "SecureString",
                "value": "MyEncryptionPassword"
            },
            "packageName": "MyPackage.dtsx",
            "packageContent": "My compressed/uncompressed package content",
            "packageLastModifiedDate": "YYYY-MM-DDTHH:MM:SSZ UTC-/+HH:MM"
        }
    }
}
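The packageContent value is typically supplied as a Base64-encoded string. Assuming uncompressed content is acceptable, the following sketch produces that string from a local .dtsx file; the file path is a placeholder:

# Read a local package file and Base64-encode it for use as packageContent.
$packageContent = [Convert]::ToBase64String([System.IO.File]::ReadAllBytes("C:\SSIS\MyPackage.dtsx"))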
To execute packages stored in package stores, enter the values for your package and configuration location properties as follows:
{
    "packageLocation": {
        "type": "PackageStore",
        "packagePath": "myPackageStore/MyFolder/MyPackage",
        "typeProperties": {
            "packagePassword": {
                "type": "SecureString",
                "value": "MyEncryptionPassword"
            },
            "accessCredential": {
                "domain": "Azure",
                "userName": "MyStorageAccount",
                "password": {
                    "type": "SecureString",
                    "value": "MyAccountKey"
                }
            },
            "configurationPath": "//MyStorageAccount.file.core.windows.net/MyFileShare/MyConfiguration.dtsConfig",
            "configurationAccessCredential": {
                "domain": "Azure",
                "userName": "MyStorageAccount",
                "password": {
                    "type": "AzureKeyVaultSecret",
                    "store": {
                        "referenceName": "myAKV",
                        "type": "LinkedServiceReference"
                    },
                    "secretName": "MyAccountKey"
                }
            }
        }
    }
}
In Azure PowerShell, switch to the C:\ADF\RunSSISPackage folder.
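For example, run the following command:

Set-Location "C:\ADF\RunSSISPackage"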
To create the pipeline RunSSISPackagePipeline, run the Set-AzDataFactoryV2Pipeline cmdlet.
$DFPipeLine = Set-AzDataFactoryV2Pipeline -DataFactoryName $DataFactory.DataFactoryName `
    -ResourceGroupName $ResGrp.ResourceGroupName `
    -Name "RunSSISPackagePipeline" `
    -DefinitionFile ".\RunSSISPackagePipeline.json"
Here's the sample output:
PipelineName      : RunSSISPackagePipeline
ResourceGroupName : <resourceGroupName>
DataFactoryName   : <dataFactoryName>
Activities        : {MySSISActivity}
Parameters        : {}
Use the Invoke-AzDataFactoryV2Pipeline cmdlet to run the pipeline. The cmdlet returns the pipeline run ID for future monitoring.
$RunId = Invoke-AzDataFactoryV2Pipeline -DataFactoryName $DataFactory.DataFactoryName `
-ResourceGroupName $ResGrp.ResourceGroupName `
-PipelineName $DFPipeLine.Name
Run the following PowerShell script to continuously check the pipeline run status until it finishes running the package. Copy or paste the following script into the PowerShell window, and select Enter.
while ($True) {
    $Run = Get-AzDataFactoryV2PipelineRun -ResourceGroupName $ResGrp.ResourceGroupName `
        -DataFactoryName $DataFactory.DataFactoryName `
        -PipelineRunId $RunId

    if ($Run) {
        if ($Run.Status -ne 'InProgress') {
            Write-Output ("Pipeline run finished. The status is: " + $Run.Status)
            $Run
            break
        }
        Write-Output "Pipeline is running...status: InProgress"
    }
    Start-Sleep -Seconds 10
}
You can also monitor the pipeline by using the Azure portal. For step-by-step instructions, see Monitor the pipeline.
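If you prefer to stay in PowerShell, you can also inspect the individual activity run for the pipeline run. The following sketch uses the Get-AzDataFactoryV2ActivityRun cmdlet; the one-hour window around the current time is an assumption, so widen it to cover your run:

# List activity runs for the pipeline run within a time window.
Get-AzDataFactoryV2ActivityRun -DataFactoryName $DataFactory.DataFactoryName `
    -ResourceGroupName $ResGrp.ResourceGroupName `
    -PipelineRunId $RunId `
    -RunStartedAfter (Get-Date).AddHours(-1) `
    -RunStartedBefore (Get-Date).AddHours(1)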
In the previous step, you ran the pipeline on demand. You can also create a schedule trigger to run the pipeline on a schedule, such as hourly or daily.
Create a JSON file named MyTrigger.json in the C:\ADF\RunSSISPackage folder with the following content:
{
    "properties": {
        "name": "MyTrigger",
        "type": "ScheduleTrigger",
        "typeProperties": {
            "recurrence": {
                "frequency": "Hour",
                "interval": 1,
                "startTime": "2017-12-07T00:00:00-08:00",
                "endTime": "2017-12-08T00:00:00-08:00"
            }
        },
        "pipelines": [{
            "pipelineReference": {
                "type": "PipelineReference",
                "referenceName": "RunSSISPackagePipeline"
            },
            "parameters": {}
        }]
    }
}
In Azure PowerShell, switch to the C:\ADF\RunSSISPackage folder.
Run the Set-AzDataFactoryV2Trigger cmdlet, which creates the trigger.
Set-AzDataFactoryV2Trigger -ResourceGroupName $ResGrp.ResourceGroupName `
-DataFactoryName $DataFactory.DataFactoryName `
-Name "MyTrigger" -DefinitionFile ".\MyTrigger.json"
By default, the trigger is in a stopped state. Start the trigger by running the Start-AzDataFactoryV2Trigger cmdlet.
Start-AzDataFactoryV2Trigger -ResourceGroupName $ResGrp.ResourceGroupName `
-DataFactoryName $DataFactory.DataFactoryName `
-Name "MyTrigger"
Confirm that the trigger is started by running the Get-AzDataFactoryV2Trigger cmdlet.
Get-AzDataFactoryV2Trigger -ResourceGroupName $ResGrp.ResourceGroupName `
    -DataFactoryName $DataFactory.DataFactoryName `
    -Name "MyTrigger"
Run the following command after the next hour. For example, if the current time is 3:25 PM UTC, run the command at 4 PM UTC.
Get-AzDataFactoryV2TriggerRun -ResourceGroupName $ResGrp.ResourceGroupName `
    -DataFactoryName $DataFactory.DataFactoryName `
    -TriggerName "MyTrigger" `
    -TriggerRunStartedAfter "2017-12-06" `
    -TriggerRunStartedBefore "2017-12-09"
Run the following query against the SSISDB database on the server that hosts it to verify that the package executed.
select * from catalog.executions
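If you'd rather run the check from PowerShell, the following sketch uses Invoke-Sqlcmd from the SqlServer module; the server name and credentials are placeholders, and the module must be installed:

# Query the most recent SSISDB executions; requires the SqlServer module.
Invoke-Sqlcmd -ServerInstance "myserver.database.windows.net" -Database "SSISDB" `
    -Username "myadmin" -Password "MyPassword" `
    -Query "SELECT TOP 10 execution_id, package_name, status, start_time FROM catalog.executions ORDER BY start_time DESC"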