What is Lakehouse Federation?

This article introduces Lakehouse Federation, the query federation platform that enables you to use Azure Databricks to run queries against multiple external data sources. It also describes how to set up Lakehouse Federation connections and create foreign catalogs in your Unity Catalog metastore.

What is Lakehouse Federation?

Lakehouse Federation is the query federation platform for Azure Databricks. The term query federation describes a collection of features that enable users and systems to run queries against multiple data sources without needing to migrate all data to a unified system.

Azure Databricks uses Unity Catalog to manage query federation. You configure read-only connections to popular database solutions using drivers that are included on pro SQL warehouses, serverless SQL warehouses, and Databricks Runtime clusters. Unity Catalog’s data governance and data lineage tools ensure that data access is managed and audited for all federated queries made by the users in your Azure Databricks workspaces.
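
After setup, users query federated data with the same three-level namespace used for any other Unity Catalog table. For example, a minimal sketch, assuming a foreign catalog named postgres_prod that mirrors a PostgreSQL database containing a sales.orders table (all names are hypothetical):

-- Query a table in the external PostgreSQL database in place.
SELECT order_id, order_total
FROM postgres_prod.sales.orders
WHERE order_date >= '2024-01-01';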

Why use Lakehouse Federation?

The lakehouse emphasizes central storage of data to reduce data redundancy and isolation. Your organization might have numerous data systems in production, and you might want to query data in connected systems for a number of reasons:

  • Ad hoc reporting.
  • Proof-of-concept work.
  • The exploratory phase of new ETL pipelines or reports.
  • Supporting workloads during incremental migration.

In each of these scenarios, query federation gets you to insights faster, because you can query the data in place and avoid complex and time-consuming ETL processing.

Lakehouse Federation is meant for use cases when:

  • You don’t want to ingest data into Azure Databricks.
  • You want your queries to take advantage of compute in the external database system.
  • You want the advantages of Unity Catalog interfaces and data governance, including fine-grained access control, data lineage, and search.

Overview of Lakehouse Federation setup

To make a dataset available for read-only querying using Lakehouse Federation, you create the following (see the combined sketch after this list):

  • A connection, a securable object in Unity Catalog that specifies a path and credentials for accessing an external database system.
  • A foreign catalog, a securable object in Unity Catalog that mirrors a database in an external data system, enabling you to perform read-only queries on that data system in your Azure Databricks workspace, managing access using Unity Catalog.
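
For example, here is a minimal combined sketch, assuming a PostgreSQL source; every name and option value is a placeholder:

-- 1. Define the connection (path and credentials for the external system).
CREATE CONNECTION my_pg_connection TYPE postgresql
OPTIONS (host '<hostname>', port '5432', user '<user>', password '<password>');

-- 2. Mirror a database in that system as a foreign catalog.
CREATE FOREIGN CATALOG my_pg_catalog USING CONNECTION my_pg_connection
OPTIONS (database '<database-name>');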

Supported data sources

Lakehouse Federation supports connections to the following database types:

  • MySQL
  • PostgreSQL
  • Amazon Redshift
  • Snowflake
  • Microsoft SQL Server
  • Azure Synapse (SQL Data Warehouse)
  • Google BigQuery
  • Databricks

Connection requirements

Workspace requirements:

  • Workspace enabled for Unity Catalog.

Compute requirements:

  • Network connectivity from your compute resource to the target database systems. See Networking recommendations for Lakehouse Federation.
  • Azure Databricks compute must use Databricks Runtime 13.3 LTS or above and Shared or Single user access mode.
  • SQL warehouses must be pro or serverless and must use 2023.40 or above.

Permissions required:

  • To create a connection, you must be a metastore admin or a user with the CREATE CONNECTION privilege on the Unity Catalog metastore attached to the workspace.
  • To create a foreign catalog, you must have the CREATE CATALOG permission on the metastore and be either the owner of the connection or have the CREATE FOREIGN CATALOG privilege on the connection.

Additional permission requirements are specified in each task-based section that follows.
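
If you are a metastore admin, you can delegate these privileges with standard Unity Catalog GRANT statements. A sketch, assuming a hypothetical principal and connection name:

-- Allow a user to create connections in the metastore.
GRANT CREATE CONNECTION ON METASTORE TO `user@example.com`;

-- Allow a user to create foreign catalogs on an existing connection.
GRANT CREATE FOREIGN CATALOG ON CONNECTION my_pg_connection TO `user@example.com`;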

Create a connection

A connection specifies a path and credentials for accessing an external database system. To create a connection, you can use Catalog Explorer or the CREATE CONNECTION SQL command in an Azure Databricks notebook or the Databricks SQL query editor.

Note

You can also use the Databricks REST API or the Databricks CLI to create a connection. See POST /api/2.1/unity-catalog/connections and Unity Catalog commands.

Permissions required: Metastore admin or user with the CREATE CONNECTION privilege.

Catalog Explorer

  1. In your Azure Databricks workspace, click Catalog.

  2. At the top of the Catalog pane, click the Add icon and select Add a connection from the menu.

    Alternatively, from the Quick access page, click the External data > button, go to the Connections tab, and click Create connection.

  3. Enter a user-friendly Connection name.

  4. Select the Connection type (database provider, like MySQL or PostgreSQL).

  5. Enter the connection properties (such as host information, path, and access credentials).

    Each connection type requires different connection information. See the article for your connection type, listed in the table of contents to the left.

  6. (Optional) Click Test connection to confirm that it works.

  7. (Optional) Add a comment.

  8. Click Create.

SQL

Run the following command in a notebook or the SQL query editor. This example is for connections to a PostgreSQL database. The options differ by connection type. See the article for your connection type, listed in the table of contents to the left.

CREATE CONNECTION <connection-name> TYPE postgresql
OPTIONS (
  host '<hostname>',
  port '<port>',
  user '<user>',
  password '<password>'
);

We recommend that you use Azure Databricks secrets instead of plaintext strings for sensitive values like credentials. For example:

CREATE CONNECTION <connection-name> TYPE postgresql
OPTIONS (
  host '<hostname>',
  port '<port>',
  user secret ('<secret-scope>','<secret-key-user>'),
  password secret ('<secret-scope>','<secret-key-password>')
);

For information about setting up secrets, see Secret management.

For information about managing existing connections, see Manage connections for Lakehouse Federation.
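
Before building a catalog on top of a new connection, you can verify it from SQL. A quick sketch:

-- List the connections defined in the metastore.
SHOW CONNECTIONS;

-- Inspect a single connection's type and options.
DESCRIBE CONNECTION <connection-name>;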

Create a foreign catalog

A foreign catalog mirrors a database in an external data system so that you can query and manage access to data in that database using Azure Databricks and Unity Catalog. To create a foreign catalog, you use a connection to the data source that has already been defined.

To create a foreign catalog, you can use Catalog Explorer or the CREATE FOREIGN CATALOG SQL command in an Azure Databricks notebook or the SQL query editor.

Note

You can also use the Unity Catalog API. See Azure Databricks reference documentation.

Foreign catalog metadata is synced into Unity Catalog on each interaction with the catalog. For the data type mapping between Unity Catalog and the data source, check the Data Type Mappings section of each data source’s documentation.

Permissions required: CREATE CATALOG permission on the metastore and either ownership of the connection or the CREATE FOREIGN CATALOG privilege on the connection.

Catalog Explorer

  1. In your Azure Databricks workspace, click Catalog to open Catalog Explorer.

  2. At the top of the Catalog pane, click the Add icon and select Add a catalog from the menu.

    Alternatively, from the Quick access page, click the Catalogs button, and then click the Create catalog button.

  3. Follow the instructions for creating foreign catalogs in Create catalogs.

SQL

Run the following SQL command in a notebook or the SQL query editor. Items in brackets are optional. Replace the placeholder values:

  • <catalog-name>: Name for the catalog in Azure Databricks.
  • <connection-name>: The connection object that specifies the data source, path, and access credentials.
  • <database-name>: Name of the database you want to mirror as a catalog in Azure Databricks. Not required for MySQL, which uses a two-layer namespace.
  • <external-catalog-name>: Databricks-to-Databricks only: Name of the catalog in the external Databricks workspace that you are mirroring. See Create a foreign catalog.

CREATE FOREIGN CATALOG [IF NOT EXISTS] <catalog-name> USING CONNECTION <connection-name>
OPTIONS (database '<database-name>');
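
Once created, the foreign catalog behaves like any other Unity Catalog catalog for read-only access. A sketch with hypothetical schema and table names:

-- Browse the schemas mirrored from the external database.
SHOW SCHEMAS IN <catalog-name>;

-- Query federated data with the usual three-level namespace.
SELECT * FROM <catalog-name>.<schema-name>.<table-name> LIMIT 10;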

For information about managing and working with foreign catalogs, see Manage and work with foreign catalogs.

Lakehouse Federation and materialized views

Databricks recommends loading external data using Lakehouse Federation when you are creating materialized views. See Use materialized views in Databricks SQL.

When you use Lakehouse Federation, users can reference the federated data as follows:

CREATE MATERIALIZED VIEW xyz AS SELECT * FROM federated_catalog.federated_schema.federated_table;
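
You can then refresh the materialized view on demand to re-run the federated query and update the stored results. A sketch:

REFRESH MATERIALIZED VIEW xyz;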

View system-generated federated queries

Lakehouse Federation translates Databricks SQL statements into statements that can be pushed down into the federated data source. To view the generated SQL statement, click the foreign data source scan node on the graph view of Query Profile, or run the EXPLAIN FORMATTED SQL statement. See the Supported Pushdown section of each data source’s documentation for coverage.
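
For example, to inspect the plan for a filtered query against the federated table used above (names are hypothetical):

-- The plan output includes the statement pushed down to the remote source.
EXPLAIN FORMATTED
SELECT * FROM federated_catalog.federated_schema.federated_table
WHERE id > 100;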

Limitations

  • Queries are read-only.
  • Throttling of connections is determined using the Databricks SQL concurrent query limit. There is no limit across warehouses per connection. See Queueing and autoscaling for pro and classic SQL warehouses.
  • Tables and schemas with names that are invalid in Unity Catalog are not supported and are ignored by Unity Catalog upon creation of a foreign catalog. See the list of naming rules and limitations in Limitations.
  • Table names and schema names are converted to lowercase in Unity Catalog. Lookups must also use lowercase names. If there are tables or schemas with duplicate lowercase names, only one of the tables or schemas is imported into the foreign catalog.
  • For each foreign table referenced, Azure Databricks schedules a subquery in the remote system to return a subset of data from that table and then returns the result to one Azure Databricks executor task over a single stream. If the result set is too large, the executor might run out of memory.
  • Single user access mode is only available for users that own the connection.
  • Lakehouse Federation cannot federate foreign tables with case-sensitive identifiers for Azure Synapse connections or Redshift connections.

Resource quotas

Azure Databricks enforces resource quotas on all Unity Catalog securable objects. These quotas are listed in Resource limits. Foreign catalogs and all objects that they contain are included in your total quota usage.

If you expect to exceed these resource limits, contact your Azure Databricks account team.

You can monitor your quota usage using the Unity Catalog resource quotas APIs. See Monitor your usage of Unity Catalog resource quotas.
