Data-Factory: Snowflake Linked Service KeyVault

Patrick Doerig 31 Reputation points

Currently, Key Vault support in the linked service template for Snowflake is not implemented properly.

  1. Even when Key Vault is referenced for the password, I get the following message:

  2. It would be helpful if the username, warehouse, and database could also be stored in Key Vault, not only the password, as this makes it difficult to work with multiple environments and DevOps.
Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

Accepted answer
  1. KranthiPakala-MSFT 46,442 Reputation points Microsoft Employee

    Hi @Patrick Doerig ,

    Apologies for the delay in response. After further analysis, we determined that when the Password field uses an Azure Key Vault (AKV) reference (Azure SQL linked service, Snowflake linked service, etc.), the warning appears and the linked service is immediately published to the Data Factory service.

    But when the connection string is stored in AKV and referenced from the linked service UI (which is currently not supported in the Snowflake UI, though it is supported for the Azure SQL linked service, etc.), the warning message does not appear and the linked service is not published immediately. Please see the image below for further clarification.


    To avoid entering the Database and Warehouse values in the linked service UI, as a workaround you can save the complete connection string (including account name, user name, password, database, and warehouse details) in Azure Key Vault. Sample connection string format: jdbc:snowflake://<accountname>.snowflakecomputing.com/?user=<username>&password=<password>&db=<database>&warehouse=<warehouse>
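
    For multi-environment setups, the per-environment secret value can be assembled from its parts before being stored in Key Vault. Below is a minimal sketch; the function name, account names, and credentials are illustrative, not part of ADF or Snowflake:

```python
from urllib.parse import quote

def build_snowflake_connection_string(account: str, user: str, password: str,
                                      database: str, warehouse: str) -> str:
    """Assemble the JDBC-style connection string ADF expects as the secret value.

    The user and password are URL-encoded so special characters
    don't break the query string.
    """
    return (
        f"jdbc:snowflake://{account}.snowflakecomputing.com/"
        f"?user={quote(user, safe='')}&password={quote(password, safe='')}"
        f"&db={database}&warehouse={warehouse}"
    )

# One secret per environment, e.g. "snowflake-conn-dev", "snowflake-conn-prod"
dev_secret = build_snowflake_connection_string(
    account="myorg-dev", user="etl_user", password="s3cret!",
    database="ANALYTICS_DEV", warehouse="WH_DEV",
)
print(dev_secret)
```

    Storing one such secret per environment lets the same linked service JSON be promoted across dev/test/prod with only the Key Vault contents differing.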

    Then create the Snowflake linked service from PowerShell using the JSON definition below, as described in this document: Create an ADF linked service using PowerShell. Alternatively, you can create a placeholder Snowflake linked service from the UI and then replace its JSON code with the JSON below to use the connection string from AKV. I have tested this and it works as expected.

           "name": "<YouSnowflakeLinkedServiceName>",  
           "type": "Microsoft.DataFactory/factories/linkedservices",  
           "properties": {  
               "annotations": [],  
               "type": "Snowflake",  
               "typeProperties": {  
                   "connectionString": {  
                       "type": "AzureKeyVaultSecret",  
                       "store": {  
                           "referenceName": "<YourAzureKeyVaultLinkedServiceName>",  
                           "type": "LinkedServiceReference"  
                       "secretName": "<YourSnowflakeConnectionstringSecretName>",  
                       "secretVersion": "<YourSnowflakeConnectionstringSecretVersion>"  

    One thing you will notice is that the linked service properties will not populate any values in the UI when you use a connection string from AKV; the linked service appears with all fields empty, as shown below. You also won't be able to test the connection, because the Test connection button is grayed out (which is odd). To test the linked service, create a dataset that uses it and import/preview data. I have tested this and it works.
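
    A minimal dataset for such a preview test might look roughly like the following (the dataset, schema, and table names are placeholders); previewing its data exercises the connection end to end:

```json
{
    "name": "SnowflakeTestDataset",
    "properties": {
        "type": "SnowflakeTable",
        "linkedServiceName": {
            "referenceName": "<YourSnowflakeLinkedServiceName>",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "schema": "<YourSchemaName>",
            "table": "<YourTableName>"
        }
    }
}
```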


    I agree that this is not an ideal way to achieve this, but it is currently the only workaround for using a complete connection string from AKV with the Snowflake linked service. I have provided feedback to the ADF engineering team to add native support for connection strings in AKV from the UI, and it is under review.

    I would also request that you share your feedback in the ADF user-voice forum and post the feedback link here, so that other community members can up-vote and comment on it to help prioritize the feature request.

    Hope this helps. Please feel free to let me know if you have any further questions.


    Thank you
    Please consider clicking "Accept Answer" and "Upvote" on the post that helps you, as it can be beneficial to other community members.

    1 person found this answer helpful.

2 additional answers

Sort by: Most helpful
  1. KranthiPakala-MSFT 46,442 Reputation points Microsoft Employee

    Hi @PatrickDoerig-3768,

    Thanks for the confirmation. When a pipeline works when triggered but not in a debug run, that suggests either a difference between the published version and the version in the UI, or parameters that depend on the trigger. Did you first create the Snowflake linked service with hard-coded values and then edit its JSON payload to use the connection string from AKV? If so, the initial linked service (hard-coded values, no AKV) was published directly to the Data Factory service. When you later edited its JSON payload with the AKV details, the edited linked service was not published immediately: it first has to be merged to master and then published from master to the Data Factory service (i.e., adf_publish). This would produce a mismatch between the published version and the local branch/UI version.

    And since the edited linked service hasn't been published yet, a debug run uses the linked service payload from the UI, while a trigger run uses the published resources in Azure. A debug run always takes the newest edits in the UI; once you publish, the UI stays consistent with the published resources, and everything should be fine.

    Do you notice the same error when you publish the linked service and then do a debug run? Could you please confirm?

    Thank you

  2. KranthiPakala-MSFT 46,442 Reputation points Microsoft Employee

    Hi @Patrick Doerig ,

    Just wanted to update you that the ADF product team has implemented the feature request you shared: AKV is now supported for the whole connection string in the Snowflake linked service. Please feel free to use it and let us know if you see any issues.


    Thank you.