Manage the default catalog

This article introduces the default Unity Catalog catalog, explains how to decide which catalog to use as the default, and shows how to change it.

What is the default catalog in Unity Catalog?

A default catalog is configured for each workspace that is enabled for Unity Catalog. The default catalog lets you perform data operations without specifying a catalog: if you omit the top-level catalog name in a data operation, the default catalog is assumed.
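
For example, assuming the workspace default catalog is main and it contains a schema named sales with a table named customers (hypothetical names), the following two queries resolve to the same table:

SELECT * FROM sales.customers;       -- resolves to main.sales.customers
SELECT * FROM main.sales.customers;  -- fully qualified name for the same table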

A workspace admin can view or change the default catalog using the Admin Settings UI. You can also set the default catalog for a cluster using a Spark configuration.

Commands that do not specify the catalog (for example GRANT CREATE TABLE ON SCHEMA myschema TO mygroup) are evaluated for the catalog in the following order:

  1. Is the catalog set for the session using a USE CATALOG statement or a JDBC setting?
  2. Is the Spark configuration spark.databricks.sql.initial.catalog.name set on the cluster?
  3. Is there a workspace default catalog set for the workspace?
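
For example, a USE CATALOG statement issued in the session takes precedence over both the cluster Spark configuration and the workspace default. A minimal sketch, assuming a catalog named dev_catalog exists (hypothetical name):

USE CATALOG dev_catalog;
GRANT CREATE TABLE ON SCHEMA myschema TO mygroup;  -- resolves to dev_catalog.myschema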

The default catalog configuration when Unity Catalog is enabled

The default catalog that was initially configured for your workspace depends on how your workspace was enabled for Unity Catalog:

  • For some workspaces that were enabled for Unity Catalog automatically, the workspace catalog was set as the default catalog. See Automatic enablement of Unity Catalog.
  • For all other workspaces, the hive_metastore catalog was set as the default catalog.

If you are transitioning from the Hive metastore to Unity Catalog within an existing workspace, it typically makes sense to use hive_metastore as the default catalog to avoid impacting existing code that references the Hive metastore.
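
For example, legacy code that uses two-level names continues to resolve against the Hive metastore as long as hive_metastore remains the default catalog. The table names below are hypothetical:

SELECT * FROM mydb.mytable;                 -- resolves to hive_metastore.mydb.mytable
SELECT * FROM hive_metastore.mydb.mytable;  -- equivalent fully qualified name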

Change the default catalog

A workspace admin can change the default catalog for the workspace. Anyone with permission to create or edit a compute resource can set a different default catalog for the compute resource.

Warning

Changing the default catalog can break existing data operations that depend on it.

To configure a different default catalog for a workspace:

  1. Log in to your workspace as a workspace admin.
  2. Click your username in the top bar of the workspace and select Admin Settings from the dropdown.
  3. Click the Advanced tab.
  4. On the Default catalog for the workspace row, enter the catalog name and click Save.

Restart your SQL warehouses and clusters for the change to take effect. All new and restarted SQL warehouses and clusters will use this catalog as the workspace default.

You can also override the default catalog for a specific cluster by setting the following Spark configuration on the cluster. This approach is not available for SQL warehouses:

spark.databricks.sql.initial.catalog.name

For instructions, see Spark configuration.
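
For example, to make a catalog named my_catalog (a placeholder for a catalog that exists in your metastore) the default for a cluster, you could add the following line to the cluster's Spark config:

spark.databricks.sql.initial.catalog.name my_catalog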

View the current default catalog

To view the current default catalog for your workspace, you can run a SQL statement in a notebook or SQL Editor query. A workspace admin can also view the default catalog using the Admin Settings UI.

Admin Settings

  1. Log in to your workspace as a workspace admin.
  2. Click your username in the top bar of the workspace and select Admin Settings from the dropdown.
  3. Click the Advanced tab.
  4. On the Default catalog for the workspace row, view the catalog name.

SQL

Run the following command in a notebook or SQL Editor query that is running on a SQL warehouse or Unity Catalog-compliant cluster. The workspace default catalog is returned as long as no USE CATALOG statement or JDBC setting has been set on the session, and as long as no spark.databricks.sql.initial.catalog.name config is set for the cluster.

SELECT current_catalog();
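
For example, in a new session on a workspace whose default catalog is hive_metastore, and with no spark.databricks.sql.initial.catalog.name override on the cluster, you could expect behavior like the following sketch; dev_catalog is a hypothetical catalog used to show the session override:

SELECT current_catalog();  -- returns hive_metastore

USE CATALOG dev_catalog;
SELECT current_catalog();  -- now returns dev_catalog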