REPLACE TABLE AS SELECT does not work with Parquet but works fine with Delta

Dhruv Singla 130 Reputation points
2024-02-08T06:46:52.4266667+00:00

I am working on Azure Databricks with Databricks Runtime 14.3 LTS (includes Apache Spark 3.5.0, Scala 2.12), and I am facing the following issue. Suppose I have a view named v1 and a database f1_processed created with the following command:

CREATE DATABASE IF NOT EXISTS f1_processed
LOCATION "abfss://******@formula1dl679student.dfs.core.windows.net/"

Then if I run the following command, it runs fine:

CREATE OR REPLACE TABLE f1_processed.circuits
AS
SELECT * FROM v1;

However, if I specify the format, as in the following code,

CREATE OR REPLACE TABLE f1_processed.circuits
USING PARQUET
AS
SELECT * FROM v1;

an error is thrown:

[UNSUPPORTED_FEATURE.TABLE_OPERATION] The feature is not supported: 
Table `spark_catalog`.`f1_processed`.`circuits` does not support REPLACE TABLE AS SELECT. 
Please check the current catalog and namespace to make sure the qualified table name is expected, 
and also check the catalog implementation which is configured by "spark.sql.catalog". SQLSTATE: 0A000

As the first command shows, REPLACE TABLE AS SELECT is supported, so the error message seems wrong. Any help is appreciated.


Accepted answer
Richard Swinbank 527 Reputation points MVP
2024-02-08T18:15:24.08+00:00

I believe this is expected behavior. The Databricks documentation for REPLACE (https://docs.databricks.com/en/sql/language-manual/sql-ref-syntax-ddl-create-table-using.html) indicates that CREATE OR REPLACE TABLE is only supported for Delta Lake tables: you can only use this syntax to replace one Delta table with another Delta table. Your first command succeeds because Delta is the default table format on Databricks, so it creates a Delta table.
