Read Databricks tables from Delta clients

This page provides an overview of using the Unity REST API to access Unity Catalog managed and external tables from external Delta clients. To create external Delta tables from external clients, see Create external Delta tables from external clients.

Use the Unity REST API to read Unity Catalog-registered tables on Azure Databricks from supported Delta clients, such as Apache Spark and DuckDB.

For a full list of supported integrations, see Unity Catalog integrations.

Tip

For information about how to read Azure Databricks data using Microsoft Fabric, see Use Microsoft Fabric to read data that is registered in Unity Catalog.

Read and write using the Unity REST API

The Unity REST API provides external clients read access to tables registered to Unity Catalog. Some clients also support creating tables and writing to existing tables.

Configure access using the endpoint /api/2.1/unity-catalog.
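For illustration, you can call this endpoint directly over HTTP. The following Python sketch lists the catalogs visible to the calling principal; it assumes the requests library is installed, and <workspace-url> and <token> are placeholders for your workspace URL and a personal access token.

# Minimal sketch: call the Unity REST API with a personal access token.
# <workspace-url> and <token> are placeholders to substitute with your own values.
import requests

workspace_url = "<workspace-url>"  # for example, https://adb-1234567890123456.7.azuredatabricks.net
token = "<token>"                  # personal access token for the principal

response = requests.get(
    f"{workspace_url}/api/2.1/unity-catalog/catalogs",
    headers={"Authorization": f"Bearer {token}"},
)
response.raise_for_status()

# Print the name of each catalog returned by the API.
for catalog in response.json().get("catalogs", []):
    print(catalog["name"])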

Requirements

Azure Databricks supports Unity REST API access to tables as part of Unity Catalog. You must have Unity Catalog enabled in your workspace to use these endpoints. The following table types are eligible for Unity REST API reads:

  • Unity Catalog managed tables.
  • Unity Catalog external tables.

Complete the configuration described in the following sections to read Databricks objects from Delta clients using the Unity REST API.

Read Delta tables with Apache Spark

The following configuration is required to read Unity Catalog managed and external Delta tables with Apache Spark:

"spark.sql.extensions": "io.delta.sql.DeltaSparkSessionExtension",
"spark.sql.catalog.spark_catalog": "io.unitycatalog.spark.UCSingleCatalog",
"spark.sql.catalog.<uc-catalog-name>": "io.unitycatalog.spark.UCSingleCatalog",
"spark.sql.catalog.<uc-catalog-name>.uri": "<workspace-url>/api/2.1/unity-catalog",
"spark.sql.catalog.<uc-catalog-name>.token": "<token>",
"spark.sql.defaultCatalog": "<uc-catalog-name>",
"spark.jars.packages": "io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0,org.apache.hadoop:hadoop-azure:3.3.6"

Substitute the following variables:

  • <uc-catalog-name>: The name of the catalog in Unity Catalog that contains your tables.
  • <workspace-url>: URL of the Azure Databricks workspace.
  • <token>: Personal access token (PAT) for the principal configuring the integration.
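As a sketch, the same configuration can be supplied when building a SparkSession in PySpark. The values below are placeholders that must be substituted, and the schema and table names in the final query are hypothetical.

from pyspark.sql import SparkSession

uc_catalog = "<uc-catalog-name>"   # catalog in Unity Catalog that contains your tables
workspace_url = "<workspace-url>"  # Azure Databricks workspace URL
token = "<token>"                  # personal access token for the principal

spark = (
    SparkSession.builder.appName("uc-delta-read")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog", "io.unitycatalog.spark.UCSingleCatalog")
    .config(f"spark.sql.catalog.{uc_catalog}", "io.unitycatalog.spark.UCSingleCatalog")
    .config(f"spark.sql.catalog.{uc_catalog}.uri", f"{workspace_url}/api/2.1/unity-catalog")
    .config(f"spark.sql.catalog.{uc_catalog}.token", token)
    .config("spark.sql.defaultCatalog", uc_catalog)
    .config(
        "spark.jars.packages",
        "io.delta:delta-spark_2.12:3.2.1,"
        "io.unitycatalog:unitycatalog-spark_2.12:0.2.0,"
        "org.apache.hadoop:hadoop-azure:3.3.6",
    )
    .getOrCreate()
)

# Read a Delta table registered in the default (Unity Catalog) catalog.
spark.sql("SELECT * FROM my_schema.my_table LIMIT 10").show()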

Note

The package versions shown above are current as of the last update to this page. Newer versions may be available. Verify that package versions are compatible with your Databricks Runtime version and Spark version.

For additional details about configuring Apache Spark for cloud object storage, see the Unity Catalog OSS documentation.