Utilizing Azure Key Vault secrets for secure password management in the Apache Spark connector for SQL Server

Thomasville99 280 Reputation points

In the provided code example, our team demonstrates using a hard-coded username and password for connecting to SQL Server using the Apache Spark connector. However, for enhanced security, I want to utilize a password stored in Azure Key Vault instead of hard-coding it.

Question: How can I modify the code to use a secret (password) stored in Azure Key Vault, such as mySQLSecret with a value of myPassword?


Accepted answer
  1. Smaran Thoomu 10,880 Reputation points Microsoft Vendor

    Hi @Thomasville99

    Thanks for the question and using MS Q&A platform.

    If you are using Databricks, the easiest method to retrieve a password secret from Azure Key Vault is to connect Azure Key Vault to Databricks as a secret scope, as explained here: https://learn.microsoft.com/en-us/azure/databricks/security/secrets/secret-scopes#--create-an-azure-key-vault-backed-secret-scope.

    Once you have created an Azure Key Vault-backed secret scope, you can retrieve a password secret from it using the dbutils.secrets.get function. This function takes two arguments: the name of the secret scope you created, and the key (name) of the secret you want to retrieve. Here's an example of how you can modify the code to retrieve the password from Azure Key Vault using dbutils.secrets.get:

    # Replace <your-secret-scope-name> with the name of your secret scope
    secret_scope_name = "<your-secret-scope-name>"
    # Replace mySQLSecret with the name of your secret in Azure Key Vault
    password_secret_name = "mySQLSecret"

    # Retrieve the password secret from Azure Key Vault
    password = dbutils.secrets.get(scope=secret_scope_name, key=password_secret_name)

    server_name = "jdbc:sqlserver://{SERVER_ADDR}"
    database_name = "database_name"
    url = server_name + ";" + "databaseName=" + database_name + ";"
    table_name = "table_name"
    username = "username"

    # Write the DataFrame using the retrieved password instead of a hard-coded one
    df.write \
        .format("com.microsoft.sqlserver.jdbc.spark") \
        .mode("overwrite") \
        .option("url", url) \
        .option("dbtable", table_name) \
        .option("user", username) \
        .option("password", password) \
        .save()
    Hope this helps. Do let us know if you have any further queries.

    If this answers your query, please click Accept Answer and select Yes for "Was this answer helpful". And if you have any further questions, do let us know.

