Run federated queries on Snowflake

Important

This feature is in Public Preview.

This article describes how to set up Lakehouse Federation to run federated queries on Snowflake data that is not managed by Azure Databricks. To learn more about Lakehouse Federation, see What is Lakehouse Federation.

To connect to your Snowflake database using Lakehouse Federation, you must create the following in your Azure Databricks Unity Catalog metastore:

  • A connection to your Snowflake database.
  • A foreign catalog that mirrors your Snowflake database in Unity Catalog so that you can use Unity Catalog query syntax and data governance tools to manage Azure Databricks user access to the database.
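After both objects exist, you can query Snowflake tables from Azure Databricks using standard three-level namespace syntax. A minimal sketch, assuming hypothetical catalog, schema, and table names:

-- Query a Snowflake table through the foreign catalog
SELECT customer_id, order_total
FROM snowflake_catalog.sales.orders
LIMIT 10;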

Before you begin

Workspace requirements:

  • Workspace enabled for Unity Catalog.

Compute requirements:

  • Network connectivity from your Databricks Runtime cluster or SQL warehouse to the target database systems. See Networking recommendations for Lakehouse Federation.
  • Azure Databricks clusters must use Databricks Runtime 13.3 LTS or above and shared or single-user access mode.
  • SQL warehouses must be Pro or Serverless.

Permissions required:

  • To create a connection, you must be a metastore admin or a user with the CREATE CONNECTION privilege on the Unity Catalog metastore attached to the workspace.
  • To create a foreign catalog, you must have the CREATE CATALOG permission on the metastore and be either the owner of the connection or have the CREATE FOREIGN CATALOG privilege on the connection.

Additional permission requirements are specified in each task-based section that follows. Example GRANT statements for these privileges appear at the end of this section.

  • If you plan to authenticate using single sign-on (SSO), create a security integration in the Snowflake console. See the following section for details.
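If you prefer to grant the required privileges using SQL rather than Catalog Explorer, a metastore admin or object owner can run GRANT statements like the following sketch (the principal, group, and connection names are hypothetical):

-- Allow a user to create connections in the metastore
GRANT CREATE CONNECTION ON METASTORE TO `user@example.com`;

-- Allow a group to create foreign catalogs that use an existing connection
GRANT CREATE FOREIGN CATALOG ON CONNECTION my_snowflake_connection TO `data-engineers`;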

(Optional) Create a security integration in the Snowflake console

If you want to authenticate using SSO, follow this step before you create a Snowflake connection. To authenticate using a username and password instead, skip this section.

Note

Only Snowflake’s native OAuth integration is supported. External OAuth integrations like Okta or Microsoft Entra ID are not supported.

In the Snowflake console, run CREATE SECURITY INTEGRATION. Replace the following values:

  • <integration-name>: A unique name for your OAuth integration.

  • <workspace-url>: An Azure Databricks workspace URL. You must set OAUTH_REDIRECT_URI to https://<workspace-url>/login/oauth/snowflake.html, where <workspace-url> is the unique URL of the Azure Databricks workspace where you will create the Snowflake connection.

  • <duration-in-seconds>: The length of time, in seconds, that refresh tokens remain valid.

    Important

    OAUTH_REFRESH_TOKEN_VALIDITY is a custom field that is set to 90 days by default. After the refresh token expires, you must re-authenticate the connection. Set the field to a reasonable time length; for example, 7776000 seconds is 90 days.

CREATE SECURITY INTEGRATION <integration-name>
TYPE = oauth
ENABLED = true
OAUTH_CLIENT = custom
OAUTH_CLIENT_TYPE = 'CONFIDENTIAL'
OAUTH_REDIRECT_URI = 'https://<workspace-url>/login/oauth/snowflake.html'
OAUTH_ISSUE_REFRESH_TOKENS = TRUE
OAUTH_REFRESH_TOKEN_VALIDITY = <duration-in-seconds>
OAUTH_ENFORCE_PKCE = TRUE;
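To confirm the integration's settings after you create it, you can describe it in the Snowflake console:

DESCRIBE SECURITY INTEGRATION <integration-name>;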

Create a connection

A connection specifies a path and credentials for accessing an external database system. To create a connection, you can use Catalog Explorer or the CREATE CONNECTION SQL command in an Azure Databricks notebook or the Databricks SQL query editor.

Permissions required: Metastore admin or user with the CREATE CONNECTION privilege.

Catalog Explorer

  1. In your Azure Databricks workspace, click Catalog icon Catalog.

  2. In the left pane, expand the External Data menu and select Connections.

  3. Click Create connection.

  4. Enter a user-friendly Connection name.

  5. Select a Connection type of Snowflake.

  6. Enter the following connection properties for your Snowflake warehouse.

    • Auth type: OAuth or Username and password
    • Host: For example, snowflake-demo.east-us-2.azure.snowflakecomputing.com
    • Port: For example, 443
    • Snowflake warehouse: For example, my-snowflake-warehouse
    • User: For example, snowflake-user
    • (OAuth) Client ID: In the Snowflake console, run SELECT SYSTEM$SHOW_OAUTH_CLIENT_SECRETS('<security_integration_name>') to retrieve the client ID for your security integration.
    • (OAuth) Client secret: In the Snowflake console, run SELECT SYSTEM$SHOW_OAUTH_CLIENT_SECRETS('<security_integration_name>') to retrieve the client secret for your security integration, as shown in the example after these steps.
    • (OAuth) Client scope: refresh_token session:role:<role-name>. Specify the Snowflake role to use in <role-name>.
    • (Username and password) Password: For example, password123

    (OAuth) You are prompted to sign in to Snowflake using your SSO credentials.

  7. (Optional) Click Test connection to confirm that it works.

  8. (Optional) Add a comment.

  9. Click Create.
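The OAuth client ID and client secret in step 6 both come from a single call to a Snowflake system function, which returns a JSON object containing the integration's client ID and client secrets. For example:

-- Run in the Snowflake console; the integration name is a placeholder
SELECT SYSTEM$SHOW_OAUTH_CLIENT_SECRETS('<security_integration_name>');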

SQL

Run the following command in a notebook or the Databricks SQL query editor.

CREATE CONNECTION <connection-name> TYPE snowflake
OPTIONS (
  host '<hostname>',
  port '<port>',
  sfWarehouse '<warehouse-name>',
  user '<user>',
  password '<password>'
);

We recommend that you use Azure Databricks secrets instead of plaintext strings for sensitive values like credentials. For example:

CREATE CONNECTION <connection-name> TYPE snowflake
OPTIONS (
  host '<hostname>',
  port '<port>',
  sfWarehouse '<warehouse-name>',
  user secret ('<secret-scope>','<secret-key-user>'),
  password secret ('<secret-scope>','<secret-key-password>')
);

For information about setting up secrets, see Secret management.
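After you create the connection, you can confirm that Unity Catalog registered it and inspect its properties from a notebook or the SQL query editor:

-- Show the connection's type, owner, and options
DESCRIBE CONNECTION <connection-name>;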

Create a foreign catalog

A foreign catalog mirrors a database in an external data system so that you can query and manage access to data in that database using Azure Databricks and Unity Catalog. To create a foreign catalog, you use a connection to the data source that has already been defined.

To create a foreign catalog, you can use Catalog Explorer or the CREATE FOREIGN CATALOG SQL command in an Azure Databricks notebook or the Databricks SQL query editor.

Permissions required: CREATE CATALOG permission on the metastore and either ownership of the connection or the CREATE FOREIGN CATALOG privilege on the connection.

Catalog Explorer

  1. In your Azure Databricks workspace, click Catalog icon Catalog.
  2. Click the Create Catalog button.
  3. On the Create a new catalog dialog, enter a name for the catalog and select a Type of Foreign.
  4. Select the Connection that provides access to the database that you want to mirror as a Unity Catalog catalog.
  5. Enter the name of the Database that you want to mirror as a catalog.
  6. Click Create.

SQL

Run the following SQL command in a notebook or the Databricks SQL query editor. Items in brackets are optional. Replace the placeholder values:

  • <catalog-name>: Name for the catalog in Azure Databricks.
  • <connection-name>: The connection object that specifies the data source, path, and access credentials.
  • <database-name>: Name of the database you want to mirror as a catalog in Azure Databricks.

CREATE FOREIGN CATALOG [IF NOT EXISTS] <catalog-name> USING CONNECTION <connection-name>
OPTIONS (database '<database-name>');
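Once the foreign catalog exists, you can govern access to it like any other Unity Catalog catalog. For example, to let a group query every table in the mirrored database (the group name is hypothetical):

-- Grant read access on the entire foreign catalog
GRANT USE CATALOG, USE SCHEMA, SELECT ON CATALOG <catalog-name> TO `data-analysts`;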

Supported pushdowns

The following pushdowns are supported:

  • Filters
  • Projections
  • Limit
  • Joins
  • Aggregates (Average, Corr, CovPopulation, CovSample, Count, Max, Min, StddevPop, StddevSamp, Sum, VariancePop, VarianceSamp)
  • Functions (String functions, Mathematical functions, Date, Time, and Timestamp functions, and other miscellaneous functions, such as Alias, Cast, SortOrder)
  • Window functions (DenseRank, Rank, RowNumber)
  • Sorting
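One way to see which operations are delegated to Snowflake is to inspect the query plan with EXPLAIN FORMATTED; pushed-down operations typically appear as part of the scan of the foreign table rather than as separate Spark operators. A minimal sketch, assuming a hypothetical catalog and table:

-- The filter and limit here are eligible for pushdown to Snowflake
EXPLAIN FORMATTED
SELECT name, region
FROM snowflake_catalog.sales.customers
WHERE region = 'EMEA'
LIMIT 100;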

Data type mappings

When you read from Snowflake to Spark, data types map as follows:

Snowflake type                                                     Spark type
decimal, number, numeric                                           DecimalType
bigint, byteint, int, integer, smallint, tinyint                   IntegerType
float, float4, float8                                              FloatType
double, double precision, real                                     DoubleType
char, character, string, text, time, varchar                       StringType
binary                                                             BinaryType
boolean                                                            BooleanType
date                                                               DateType
datetime, timestamp, timestamp_ltz, timestamp_ntz, timestamp_tz    TimestampType
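To see the Spark type that each column of a mirrored table receives, you can describe the table from Azure Databricks (the catalog, schema, and table names are hypothetical):

-- The reported column types are the Spark types from the mapping above
DESCRIBE TABLE snowflake_catalog.sales.orders;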

OAuth limitations

The following are OAuth support limitations:

  • The Snowflake OAuth endpoint must be accessible from Databricks control plane IPs. See Outbound from Azure Databricks control plane. Snowflake supports configuring network policies at the security integration level, which allows for a separate network policy that enables direct connectivity from the Databricks control plane to the OAuth endpoint for authorization.
  • The Use Proxy, Proxy host, Proxy port, and Snowflake role configuration options are not supported. Specify the Snowflake role as part of the OAuth scope.
