Resource iteration in ADF ARM templates for DataLakeAnalyticsScope


I am trying to create an ADF ARM template that deploys a pipeline containing N DataLakeAnalyticsScope activities, where N is driven by parameters.

For example, the parameters file looks like this. It contains an array of scope activities, and I want the template to create one scope activity per element of the array.

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "factoryName": {
      "value": "factoryName"
    },
    "adls_ls": { "value": "adls_ls" },
    "adla_ls": { "value": "adla_ls" },
    "scopeActivities": {
      "value": [
        {
          "scriptFolder": "folder1",
          "scriptFile": "file1"
        },
        {
          "scriptFolder": "folder2",
          "scriptFile": "file2"
        }      
      ]
    }
  }
}

The option I found is to use a resource iterator (copy loop), as described in this article.
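
As I understand it, a resource copy loop generally looks like the snippet below. The storage account is just a placeholder resource to show where the copy element and copyIndex() go; it is not part of my template:

{
  // placeholder resource just to illustrate the copy syntax
  "type": "Microsoft.Storage/storageAccounts",
  "apiVersion": "2022-09-01",
  "name": "[concat('mystore', copyIndex())]",
  "location": "[resourceGroup().location]",
  "sku": { "name": "Standard_LRS" },
  "kind": "StorageV2",
  "copy": {
    "name": "storagecopy",
    "count": 3
  }
}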

As per the article, my ARM template should look something like this:

{
  "$schema": "http://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "factoryName": {
      "type": "string",
      "metadata": "Data Factory name"
    },
    "adls_ls": { "type": "string" },
    "adla_ls": { "type": "string" },
    "scopeActivities": {
      "type": "array"
    },
    "scriptFolderPath": {
      "type": "string"
    },
    "scriptFileName": {
      "type": "string"
    }
  },
  "variables": {
    "factoryId": "[concat('Microsoft.DataFactory/factories/', parameters('factoryName'))]",
    "pipelineName": "POCPipeline"
  },
  "resources": [
    {
      "name": "[concat(parameters('factoryName'), '/', variables('pipelineName'))]",
      "type": "Microsoft.DataFactory/factories/pipelines",
      "apiVersion": "2018-06-01",
      "properties": {
        "activities": [
          //other activities
        ]
      }
    },
    {
      "name": "[concat(parameters('factoryName'), '/', variables('pipelineName'))],'/',[concat('Scope Activity ',parameters('scopeActivities')[copyIndex()]))]",
      "type": "Microsoft.DataFactory/factories/pipelines/DataLakeAnalyticsScope",
      "dependsOn": [
        "[concat(parameters('factoryName'), '/', variables('pipelineName'))]"
      ],
      "policy": {
        "timeout": "0.12:00:00",
        "retry": 0,
        "retryIntervalInSeconds": 30,
        "secureOutput": false,
        "secureInput": false
      },
      "userProperties": [],
      "typeProperties": {
        "scriptFolderPath": "[parameters('scopeActivities')[copyIndex()].scriptFolder]",
        "scriptFileName": "parameters('scopeActivities')[copyIndex()].scriptFile",
        "scriptLinkedService": {
          "referenceName": "[parameters('adls_ls')]",
          "type": "LinkedServiceReference"
        }
      },
      "linkedServiceName": {
        "referenceName": "[parameters('adls_ls')]",
        "type": "LinkedServiceReference"
      },
      "copy": {
        "name": "blobcopy",
        "count": "[length(parameters('scopeActivities'))]"
      }
    }
  ]
}

But I get this error

New-AzResourceGroupDeployment : 6:19:39 PM - Error: Code=InvalidRequestContent; Message=The request content was invalid and could not be deserialized: 'Could not find member 'policy' on object of type 'TemplateResource'. Path 'properties.template.resources[1].policy', line 72, position 19.'
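
For reference, my understanding of the resource shape the deployment schema accepts is roughly this, with everything service-specific nested under properties (a generic sketch, not my actual template):

{
  "name": "...",
  "type": "...",
  "apiVersion": "2018-06-01",
  "dependsOn": [],
  "copy": {
    "name": "somecopyloop",
    "count": 1
  },
  "properties": {
    // resource-specific settings go here
  }
}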

I don't know what is wrong with the above template. Can someone please help?

Thanks
