Temporary tables

Applies to: ✅ Databricks SQL

Important

This feature is in Public Preview. You can confirm preview enrollment on the Previews page. See Manage Azure Databricks previews.

This page describes how to use temporary tables for session-scoped intermediate data storage and analysis on SQL warehouse compute.

Temporary tables store data for the duration of an Azure Databricks session. Use temporary tables to materialize intermediate results for exploratory analysis or SQL data pipelines without creating permanent tables in your catalog.

Temporary tables are available on SQL warehouse compute only. Classic compute and serverless compute are not yet supported.

When to use temporary tables

Use temporary tables when you need to do any of the following:

  • Store short-lived intermediate data during exploratory analysis or workflow development.
  • Reuse query results across multiple operations within the same session.
  • Work with a table-like interface without adding objects to your catalog namespace.

If your data needs to persist beyond the current session or must be shared with other users or jobs, use a permanent Unity Catalog table instead. See Unity Catalog managed tables in Azure Databricks for Delta Lake and Apache Iceberg.

Create a temporary table

Create a temporary table using CREATE TEMPORARY TABLE or CREATE TEMP TABLE syntax. You can create an empty table with a defined schema or create a table from query results.

-- Create an empty temporary table with a defined schema
CREATE TEMPORARY TABLE temp_customers (
  id INT,
  name STRING,
  email STRING
);

-- Create a temporary table from query results
CREATE TEMP TABLE temp_recent_orders AS
SELECT order_id, customer_id, order_date, amount
FROM prod.sales.orders
WHERE order_date >= current_date() - INTERVAL 30 DAYS;

-- Create a temporary table using VALUES clause
CREATE TEMP TABLE temp_test_data AS
VALUES
  (9001, 101, 50.00),
  (9002, 204, 75.00),
  (9003, 101, 25.00)
AS t(order_id, customer_id, amount);

Note

  • CREATE OR REPLACE TEMP TABLE syntax isn't yet supported. To replace a temporary table, drop it first and then recreate it.
  • Don't specify the USING clause when creating temporary tables. Temporary tables use the Delta format by default, and explicitly specifying a format causes an error.
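Because `CREATE OR REPLACE TEMP TABLE` isn't supported, replacing a temporary table takes two statements. A minimal sketch, reusing the `temp_recent_orders` example from above:

```sql
-- Replace a temporary table: drop it first, then recreate it
DROP TEMP TABLE IF EXISTS temp_recent_orders;

CREATE TEMP TABLE temp_recent_orders AS
SELECT order_id, customer_id, order_date, amount
FROM prod.sales.orders
WHERE order_date >= current_date() - INTERVAL 7 DAYS;
```

Using `IF EXISTS` makes the pattern safe to run whether or not the table already exists in the session.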

Query temporary tables

Reference temporary tables using a single-part name. Don't specify a catalog or schema.

-- Query a temporary table
SELECT * FROM temp_customers;

-- Join temporary tables with permanent tables
SELECT
  c.name,
  o.order_id,
  o.amount
FROM temp_customers c
INNER JOIN temp_recent_orders o
  ON c.id = o.customer_id;

When you reference a table with a single-part name, Azure Databricks searches in the following order:

  1. Temporary tables in the current session
  2. Permanent tables in the current schema

If you create a temporary table with the same name as an existing permanent table, the temporary table takes precedence within your session. To explicitly reference a permanent table that has the same name as a temporary table, use a fully qualified three-part name:

-- References temporary table (if it exists)
SELECT * FROM customers;

-- Explicitly references permanent table
SELECT * FROM prod.sales.customers;

For more information about name resolution in Azure Databricks, see Name resolution.

Modify temporary tables

You can insert, update, and merge data into temporary tables using standard DML operations.

-- Insert data into a temporary table
INSERT INTO temp_customers VALUES (101, 'Jane Doe', 'jane@example.com');

-- Insert from a query
INSERT INTO temp_customers
SELECT id, name, email
FROM prod.customer.active_customers
WHERE region = 'US-WEST';

-- Update rows in a temporary table
UPDATE temp_recent_orders
SET amount = amount * 0.90
WHERE customer_id = 101;

-- Merge data into a temporary table
MERGE INTO temp_customers target
USING prod.customer.new_signups source
ON target.id = source.id
WHEN MATCHED THEN UPDATE SET *
WHEN NOT MATCHED THEN INSERT *;

Note

DELETE FROM operations are not yet supported for temporary tables. Use MERGE INTO with filtering conditions or recreate the table with filtered data instead.
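One way to emulate a row deletion is the `MERGE INTO` approach mentioned above. This sketch assumes a `WHEN MATCHED THEN DELETE` clause is permitted against temporary tables, and uses a hypothetical filter (`amount = 0`) on the earlier `temp_recent_orders` example:

```sql
-- Emulate DELETE: rows that match the filtered source are removed
MERGE INTO temp_recent_orders t
USING (SELECT order_id FROM temp_recent_orders WHERE amount = 0) s
ON t.order_id = s.order_id
WHEN MATCHED THEN DELETE;
```

Alternatively, create a new temporary table from a `SELECT` that excludes the unwanted rows, then drop the original.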

Drop temporary tables

Azure Databricks automatically drops temporary tables when your session ends, but you can explicitly drop them if you no longer need them.

-- Drop a temporary table
DROP TEMP TABLE temp_customers;

-- Drop only if it exists
DROP TEMP TABLE IF EXISTS temp_recent_orders;

Lifecycle and time limits

Temporary tables exist only within the Azure Databricks session where you create them. The maximum lifetime is seven days from session creation. Temporary tables become inaccessible when the session ends or after seven days, whichever comes first. These limits apply to notebooks, SQL Editor, jobs, and JDBC/ODBC sessions.

Storage and cleanup

Azure Databricks automatically manages temporary table storage. Data persists in cloud storage, similar to Unity Catalog managed tables. When a temporary table becomes inaccessible (the session ends or the seven-day limit is reached), you can no longer query the data. Azure Databricks reclaims the underlying storage automatically in the background, usually within a few days. Cleanup happens even if the session disconnects or the cluster shuts down unexpectedly.

Isolation and privileges

Any user can create temporary tables. You don't need CREATE TABLE privileges on a catalog or schema in Unity Catalog.

Temporary tables provide session-level isolation. Each temporary table exists only within the session that created it. No other users can read, modify, or detect the existence of your temporary tables.

You can create a temporary table with the same name as an existing permanent table without collision. For details about how Azure Databricks resolves table names, see Query temporary tables.

Temporary tables share a namespace with temporary views. You can't create both a temporary table and a temporary view with the same name in the same session.
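For example, the second statement below fails because the name is already taken in the shared namespace (the table name `temp_stats` is hypothetical):

```sql
CREATE TEMP TABLE temp_stats (id INT);

-- Fails: temp_stats already exists as a temporary table in this session
CREATE TEMPORARY VIEW temp_stats AS SELECT 1 AS id;
```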

Limitations

Temporary tables have the following limitations:

  • Schema modifications: ALTER TABLE operations are not supported. To modify a temporary table's schema, drop and recreate the table.
  • Cloning: Shallow and deep cloning are not supported.
  • Time travel: Time travel queries are not supported.
  • Streaming: You can't use temporary tables in streaming queries (for example, in foreachBatch operations).
  • API support: Only SQL APIs are supported. DataFrame APIs are not supported.
  • Notebook multi-user access: Only one user can interact with temporary tables in a notebook session. A second user cannot read, write, or create temporary tables in the same session.
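Because `ALTER TABLE` isn't supported, a schema change such as adding a column means recreating the table. A sketch under a new name, using the `temp_customers` example from earlier (the derived `email_domain` column is illustrative):

```sql
-- Add a derived column by recreating the table with a new schema
CREATE TEMP TABLE temp_customers_v2 AS
SELECT id, name, email, split(email, '@')[1] AS email_domain
FROM temp_customers;

DROP TEMP TABLE temp_customers;
```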

What's next