Read Databricks tables from Delta clients

This page provides an overview of using the Unity REST API to access Unity Catalog managed and external tables from external Delta clients. To create external Delta tables from external clients, see Create external Delta tables from external clients.

To read Unity Catalog-registered tables on Azure Databricks from supported Iceberg clients instead, including Apache Spark and DuckDB, use the Iceberg REST catalog.

For a full list of supported integrations, see Unity Catalog integrations.

Tip

For information about how to read Azure Databricks data using Microsoft Fabric, see Use Microsoft Fabric to read data that is registered in Unity Catalog.

Read and write using the Unity REST API

The Unity REST API provides external clients read access to tables registered to Unity Catalog. Some clients also support creating tables and writing to existing tables.

Configure access using the endpoint /api/2.1/unity-catalog.
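For example, you can retrieve a table's metadata from this endpoint over HTTP. The following is a minimal sketch in Python using the requests library; the workspace URL, token, and the table name main.default.my_table are placeholders to replace with your own values.

import requests

# Minimal sketch: fetch metadata for one table from the Unity REST API.
# <workspace-url>, <token>, and main.default.my_table are placeholders.
workspace_url = "<workspace-url>"
token = "<token>"
table_full_name = "main.default.my_table"

resp = requests.get(
    f"{workspace_url}/api/2.1/unity-catalog/tables/{table_full_name}",
    headers={"Authorization": f"Bearer {token}"},
)
resp.raise_for_status()
info = resp.json()

# Fields such as table_type and storage_location are part of the table
# metadata returned by the endpoint.
print(info.get("table_type"), info.get("storage_location"))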

Requirements

Azure Databricks supports Unity REST API access to tables as part of Unity Catalog. You must have Unity Catalog enabled in your workspace to use these endpoints. The following table types are eligible for Unity REST API reads:

  • Unity Catalog managed tables.
  • Unity Catalog external tables.

To read Databricks objects from Delta clients using the Unity REST API, you must complete the following configuration steps:

Read Delta tables with Apache Spark using PAT authentication

The following configuration is required to read Unity Catalog managed and external Delta tables with Apache Spark using PAT authentication:

"spark.sql.extensions": "io.delta.sql.DeltaSparkSessionExtension",
"spark.sql.catalog.spark_catalog": "io.unitycatalog.spark.UCSingleCatalog",
"spark.sql.catalog.<uc-catalog-name>": "io.unitycatalog.spark.UCSingleCatalog",
"spark.sql.catalog.<uc-catalog-name>.uri": "<workspace-url>/api/2.1/unity-catalog",
"spark.sql.catalog.<uc-catalog-name>.token": "<token>",
"spark.sql.defaultCatalog": "<uc-catalog-name>",
"spark.jars.packages": "io.delta:delta-spark_2.13:4.0.1,io.unitycatalog:unitycatalog-spark_2.13:0.3.1,org.apache.hadoop:hadoop-azure:3.3.6"

Substitute the following variables:

  • <uc-catalog-name>: The name of the catalog in Unity Catalog that contains your tables.
  • <workspace-url>: URL of the Azure Databricks workspace.
  • <token>: Personal access token (PAT) for the principal configuring the integration.
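With these values substituted, you can pass the configuration when creating the Spark session. The following is a minimal PySpark sketch, assuming a catalog named main and a table main.default.my_table; replace both, along with the workspace URL and token, with your own values.

from pyspark.sql import SparkSession

# Minimal sketch: read a Unity Catalog Delta table over the Unity REST API
# using PAT authentication. "main" and main.default.my_table are placeholders.
spark = (
    SparkSession.builder.appName("uc-delta-read")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog", "io.unitycatalog.spark.UCSingleCatalog")
    .config("spark.sql.catalog.main", "io.unitycatalog.spark.UCSingleCatalog")
    .config("spark.sql.catalog.main.uri", "<workspace-url>/api/2.1/unity-catalog")
    .config("spark.sql.catalog.main.token", "<token>")
    .config("spark.sql.defaultCatalog", "main")
    # Packages must be resolvable when the session starts.
    .config(
        "spark.jars.packages",
        "io.delta:delta-spark_2.13:4.0.1,"
        "io.unitycatalog:unitycatalog-spark_2.13:0.3.1,"
        "org.apache.hadoop:hadoop-azure:3.3.6",
    )
    .getOrCreate()
)

# Read the table by its three-level name.
df = spark.table("main.default.my_table")
df.show(5)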

To enable automatic credential renewal for long-running jobs, add the following configuration:

"spark.sql.catalog.<catalog-name>.renewCredential.enabled": true

Note

The package versions shown above are current as of the last update to this page. Newer versions may be available. Verify that package versions are compatible with your Databricks Runtime version and Spark version.

For additional details about configuring Apache Spark for cloud object storage, see the Unity Catalog OSS documentation.

Read Delta tables with Apache Spark using OAuth authentication

Azure Databricks also supports OAuth machine-to-machine (M2M) authentication. OAuth automatically handles token renewal for Unity Catalog authentication. For long-running jobs that also require automatic cloud storage credential renewal, enable the spark.sql.catalog.<uc-catalog-name>.renewCredential.enabled setting in your Spark configuration.

OAuth authentication for external Spark clients requires the OAuth token endpoint for your workspace and the client ID and client secret of a service principal that has access to the catalog.

The following configuration is required to read Unity Catalog managed tables and external Delta tables with Apache Spark using OAuth authentication:

"spark.sql.extensions": "io.delta.sql.DeltaSparkSessionExtension",
"spark.sql.catalog.spark_catalog": "io.unitycatalog.spark.UCSingleCatalog",
"spark.sql.catalog.<uc-catalog-name>": "io.unitycatalog.spark.UCSingleCatalog",
"spark.sql.catalog.<uc-catalog-name>.uri": "<workspace-url>/api/2.1/unity-catalog",
"spark.sql.catalog.<uc-catalog-name>.auth.type": "oauth",
"spark.sql.catalog.<uc-catalog-name>.auth.oauth.uri": "<oauth-token-endpoint>",
"spark.sql.catalog.<uc-catalog-name>.auth.oauth.clientId": "<oauth-client-id>",
"spark.sql.catalog.<uc-catalog-name>.auth.oauth.clientSecret": "<oauth-client-secret>",
"spark.sql.catalog.<uc-catalog-name>.renewCredential.enabled": "true",
"spark.sql.defaultCatalog": "<uc-catalog-name>",
"spark.jars.packages": "io.delta:delta-spark_2.13:4.0.1,io.unitycatalog:unitycatalog-spark_2.13:0.3.1,org.apache.hadoop:hadoop-azure:3.3.6"

Substitute the following variables:

  • <uc-catalog-name>: The name of the catalog in Unity Catalog that contains your tables.
  • <workspace-url>: URL of the Azure Databricks workspace.
  • <oauth-token-endpoint>: The OAuth token endpoint of your workspace (for example, <workspace-url>/oidc/v1/token).
  • <oauth-client-id>: OAuth client ID of the service principal configuring the integration.
  • <oauth-client-secret>: OAuth client secret of the service principal.

Note

The package versions shown above are current as of the last update to this page. Newer versions may be available. Verify that package versions are compatible with your Spark version.
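As with PAT authentication, the OAuth configuration can be supplied when the Spark session is created. The following is a minimal PySpark sketch, assuming a catalog named main and a table main.default.my_table; the endpoint and credentials are placeholders to replace with your own values.

from pyspark.sql import SparkSession

# Minimal sketch: read a Unity Catalog Delta table over the Unity REST API
# using OAuth M2M authentication. All bracketed values and "main" are placeholders.
spark = (
    SparkSession.builder.appName("uc-delta-read-oauth")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog", "io.unitycatalog.spark.UCSingleCatalog")
    .config("spark.sql.catalog.main", "io.unitycatalog.spark.UCSingleCatalog")
    .config("spark.sql.catalog.main.uri", "<workspace-url>/api/2.1/unity-catalog")
    .config("spark.sql.catalog.main.auth.type", "oauth")
    .config("spark.sql.catalog.main.auth.oauth.uri", "<oauth-token-endpoint>")
    .config("spark.sql.catalog.main.auth.oauth.clientId", "<oauth-client-id>")
    .config("spark.sql.catalog.main.auth.oauth.clientSecret", "<oauth-client-secret>")
    .config("spark.sql.catalog.main.renewCredential.enabled", "true")
    .config("spark.sql.defaultCatalog", "main")
    .config(
        "spark.jars.packages",
        "io.delta:delta-spark_2.13:4.0.1,"
        "io.unitycatalog:unitycatalog-spark_2.13:0.3.1,"
        "org.apache.hadoop:hadoop-azure:3.3.6",
    )
    .getOrCreate()
)

# Query the table by its three-level name.
spark.sql("SELECT * FROM main.default.my_table LIMIT 10").show()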