Encrypt Azure Data Factory with customer-managed keys

Akash Verma 31 Reputation points
2020-10-02T11:24:51.627+00:00

I want to encrypt Azure Data Factory with customer-managed keys. I have gone through this link: https://learn.microsoft.com/en-us/azure/data-factory/enable-customer-managed-key
It shows everything via the console/GUI. Is there any PowerShell, Azure CLI, or REST API way to implement encryption on Azure Data Factory with customer-managed keys?


5 answers

  1. Akash Verma 31 Reputation points
    2020-10-07T09:11:59.527+00:00

    Hello @MartinJaffer-MSFT , can you please confirm the query raised regarding a programmatic way to encrypt Azure Data Factory with customer-managed keys.


  2. MartinJaffer-MSFT 26,106 Reputation points
    2020-10-09T08:37:39.26+00:00

    @Akash Verma thank you for your patience. It is possible via REST API, but is not yet included in the documentation.

    To update the property, update the data factory via a PATCH request (the Factories - Update REST call). To change only the encryption, the body should look like the following:

       {
         "properties": {
           "encryption": {
             "VaultBaseUrl": "https://MyVaultName.vault.azure.net",
             "KeyName": "MyKeyName",
             "KeyVersion": "XXXXXXXXXXXXXXXXXXXXX"
           }
         }
       }
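    A minimal sketch of issuing that PATCH from Python. The subscription ID, resource group, factory name, and key details are placeholders you would supply yourself; `api-version=2018-06-01` matches the factory version shown in the REST output below.

```python
def build_update_request(subscription_id, resource_group, factory_name,
                         vault_url, key_name, key_version):
    """Build the URL and body for the Factories - Update (PATCH) call."""
    url = (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.DataFactory/factories/{factory_name}"
        "?api-version=2018-06-01"
    )
    # Body changes only the encryption property, as described above.
    body = {
        "properties": {
            "encryption": {
                "VaultBaseUrl": vault_url,
                "KeyName": key_name,
                "KeyVersion": key_version,
            }
        }
    }
    return url, body

# Usage sketch: send it with the requests library and a bearer token
# obtained elsewhere (e.g. from `az account get-access-token`):
#   import requests
#   url, body = build_update_request("<sub-id>", "MyRG", "MyFactoryName",
#                                    "https://MyVaultName.vault.azure.net",
#                                    "MyKeyName", "<key-version>")
#   requests.patch(url, json=body,
#                  headers={"Authorization": f"Bearer {token}"})
```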
    

    I found this by initially setting up the customer-managed key encryption in the UI, then using the REST API to get the factory. Abbreviated, the output looked like the following:

       {
         "name": "MyFactoryName",
         "id": "/subscriptions/XXXXXXXXXXXXXXXXXX/resourceGroups/MyRG/providers/Microsoft.DataFactory/factories/myfactoryname",
         "type": "Microsoft.DataFactory/factories",
         "properties": {
           "provisioningState": "Succeeded",
           "createTime": "2020-10-09T07:10:06.7688617Z",
           "version": "2018-06-01",
           "factoryStatistics": {
             "totalResourceCount": 0,
             "maxAllowedResourceCount": 0,
             "factorySizeInGbUnits": 0,
             "maxAllowedFactorySizeInGbUnits": 0
           },
           "encryption": {
             "VaultBaseUrl": "https://MyKeyVaultName.vault.azure.net",
             "KeyName": "CMK",
             "KeyVersion": "XXXXXXXXXXXXXXXXXXXXXXX"
           }
         },
         "eTag": ...
       }
    

  3. 2020-10-09T13:18:40.123+00:00

    Hey guys,

    @MartinJaffer-MSFT , is it possible to do it via PowerShell / az CLI? I currently have my ADFs set up via Terraform and I would like to automate this process.

    Thanks in advance.


  4. MartinJaffer-MSFT 26,106 Reputation points
    2020-10-13T22:41:11.863+00:00

    @Nascimento, Everton (EXT - PT/Amadora) sorry for missing your ping.
    It is not yet implemented in PowerShell, the SDKs, or the az CLI. I did hear they plan to make it available later this year, but there is no fixed date.

    That said, I think there may be a work-around.
    I remember a previous issue long ago where I had trouble deploying a certain dataset using the Data Factory PowerShell commands. The work-around was to instead deploy using New-AzResource or Set-AzResource. I think it should be possible to do the same here.
    In that past instance, I supplied the dataset definition in a file. I believe in this case, what we want to deploy should look the same as it does for the REST API.

    It is possible to get the details of a factory via

       Get-AzResource -ResourceGroupName "MyGroup" -ResourceName "MyFactory" -ResourceType "Microsoft.DataFactory/factories" | ConvertTo-Json -Depth 10

    or

       (Get-AzResource -ResourceGroupName "MyGroup" -ResourceName "MyFactory" -ResourceType "Microsoft.DataFactory/factories").properties
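    The read-modify-write pattern behind that work-around can be sketched language-neutrally; here is a hedged Python illustration (the PowerShell equivalent would use Get-AzResource and Set-AzResource, and the field names simply mirror the REST output shown earlier):

```python
import copy

def add_encryption(factory_definition, vault_url, key_name, key_version):
    """Return a copy of a fetched factory definition with the
    customer-managed-key encryption block merged into its properties.
    All argument values are placeholders the caller supplies."""
    updated = copy.deepcopy(factory_definition)
    # Merge the encryption block; leave every other property untouched.
    updated.setdefault("properties", {})["encryption"] = {
        "VaultBaseUrl": vault_url,
        "KeyName": key_name,
        "KeyVersion": key_version,
    }
    return updated
```

    The updated definition would then be pushed back with the generic resource writer, e.g. (sketch) `Set-AzResource -ResourceId $factory.id -Properties $updated.properties`.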


  5. MartinJaffer-MSFT 26,106 Reputation points
    2020-10-22T23:38:32.507+00:00

    @Nascimento, Everton (EXT - PT/Amadora) Azure is a very large, complex system. The services, assets, and resources you can deploy are all very different, like visiting a zoo and seeing all the different animals. To describe them, there is a generalized "resource" class in Azure. Data Factory is a specialized version of a resource. This is like an elephant is a type of animal.

    To create/update an Azure resource, there is the Set-AzResource command. Given how all the Azure resource types have different, independent properties, Set-AzResource does not attempt to validate the properties particular to any given resource type. It only validates those properties common to all types. The particulars are passed as part of the payload. This is similar to how the details of a pipeline can be set by Set-AzDataFactoryV2Pipeline.

    Set-AzDataFactoryV2 works differently from Set-AzDataFactoryV2Pipeline. Whereas Set-AzDataFactoryV2Pipeline expects you to supply the payload (definition file), Set-AzDataFactoryV2 asks you for the details, and generates a payload (definition file) from this, and sends that to Azure.

    Since the encryption property is missing from the Set-AzDataFactoryV2 cmdlet, it does not know how to write that property to the definition file. I am proposing to explicitly create the definition file and use the generic resource deployer, Set-AzResource, to write the factory the same way Set-AzDataFactoryV2Pipeline writes a pipeline. (It is also possible to write a pipeline using Set-AzResource.)
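    Under that proposal, creating the explicit definition file could look roughly like this. This is a hedged sketch: the file name, vault URL, key name, and key version are all placeholders, and the actual deployment step happens in PowerShell via Set-AzResource.

```python
import json

# Explicit definition file for the factory, in the same shape the REST
# API returns; every value here is a placeholder.
definition = {
    "properties": {
        "encryption": {
            "VaultBaseUrl": "https://MyVaultName.vault.azure.net",
            "KeyName": "MyKeyName",
            "KeyVersion": "XXXXXXXXXXXXXXXXXXXXX",
        }
    }
}

with open("factory.json", "w") as f:
    json.dump(definition, f, indent=2)

# Then deploy from PowerShell (sketch):
#   $def = Get-Content factory.json | ConvertFrom-Json
#   Set-AzResource -ResourceGroupName "MyGroup" -ResourceName "MyFactory" `
#       -ResourceType "Microsoft.DataFactory/factories" `
#       -Properties $def.properties -Force
```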

