Secret workflow example
In this workflow example, we use secrets to set up JDBC credentials for connecting to an Azure Data Lake Store.
Create a secret scope
Create a secret scope called jdbc.
To create a Databricks-backed secret scope:
databricks secrets create-scope jdbc
To create an Azure Key Vault-backed secret scope, follow the instructions in Create an Azure Key Vault-backed secret scope.
Note
If your account does not have the Premium plan, you must create the scope with MANAGE permission granted to all users (“users”). For example:
databricks secrets create-scope jdbc --initial-manage-principal users
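Scope names are restricted to alphanumeric characters, dashes, underscores, and periods, with a maximum length of 128 characters. A quick stand-alone check of that constraint (the helper below is our own sketch, not part of the Databricks CLI or SDK):

```python
import re

# Sketch: validate a candidate secret scope name against the documented
# constraint (alphanumeric characters, dashes, underscores, and periods,
# at most 128 characters). Illustration only, not a Databricks API.
SCOPE_NAME = re.compile(r"^[A-Za-z0-9._-]{1,128}$")

def is_valid_scope_name(name: str) -> bool:
    return SCOPE_NAME.fullmatch(name) is not None

print(is_valid_scope_name("jdbc"))        # True
print(is_valid_scope_name("jdbc scope"))  # False: contains a space
```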
Create secrets
The method for creating the secrets depends on whether you are using an Azure Key Vault-backed scope or a Databricks-backed scope.
Create the secrets in an Azure Key Vault-backed scope
Add the secrets username and password using the Azure Set Secret REST API or the Azure portal UI.
Create the secrets in a Databricks-backed scope
Add the secrets username and password. Run the following commands and enter the secret values in the editor that opens.
databricks secrets put-secret jdbc username
databricks secrets put-secret jdbc password
Use the secrets in a notebook
Use the dbutils.secrets utility to access secrets in notebooks. The following example reads the secrets stored in the secret scope jdbc to configure a JDBC read operation:
Python
username = dbutils.secrets.get(scope = "jdbc", key = "username")
password = dbutils.secrets.get(scope = "jdbc", key = "password")
df = (spark.read
.format("jdbc")
.option("url", "<jdbc-url>")
.option("dbtable", "<table-name>")
.option("user", username)
.option("password", password)
.load()
)
Scala
val username = dbutils.secrets.get(scope = "jdbc", key = "username")
val password = dbutils.secrets.get(scope = "jdbc", key = "password")
val df = spark.read
.format("jdbc")
.option("url", "<jdbc-url>")
.option("dbtable", "<table-name>")
.option("user", username)
.option("password", password)
.load()
The values fetched from the scope are redacted from the notebook output. See Secret redaction.
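Redaction works by replacing any literal occurrence of a secret value in notebook output with the placeholder [REDACTED]. Databricks applies this automatically; the function below is only a minimal stand-alone sketch of the idea:

```python
# Illustration only: Databricks redacts secrets from notebook output
# automatically. This stand-in shows the basic idea of replacing any
# literal occurrence of a secret value with a placeholder.
def redact(output: str, secret_values: list[str]) -> str:
    for value in secret_values:
        output = output.replace(value, "[REDACTED]")
    return output

print(redact("connecting as admin/s3cr3t", ["s3cr3t"]))
# connecting as admin/[REDACTED]
```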
Grant access to another group
Note
This step requires that your account have the Premium plan.
After verifying that the credentials are configured correctly, share them with the datascience group to use in their analysis by granting that group permission to read the secret scope and list its available secrets.
Grant the datascience group the READ permission on these credentials by running the following command:
databricks secrets put-acl jdbc datascience READ
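Secret ACL permission levels form a hierarchy: MANAGE implies WRITE, which implies READ. A small sketch of that ordering (the helper below is our own illustration, not part of the Databricks API):

```python
# Illustration: secret ACL permission levels and their ordering
# (READ < WRITE < MANAGE). MANAGE implies WRITE, which implies READ.
# This helper is a sketch, not a Databricks API.
LEVELS = {"READ": 1, "WRITE": 2, "MANAGE": 3}

def allows(granted: str, required: str) -> bool:
    return LEVELS[granted] >= LEVELS[required]

print(allows("READ", "READ"))    # True: datascience can read secret values
print(allows("READ", "MANAGE"))  # False: cannot manage the scope's ACLs
```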
For more information about secret access control, see Secret ACLs.