The error giving the table name was relevant after all: the calling procedure was passing the table name rather than the path when building the Delta table for the merge.
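In other words, the fix was simply to hand the merge the storage path instead of the metastore table name. A minimal sketch of the distinction, assuming the Python Delta Lake API; spark, path and table are placeholders and mergeIntoDelta is a hypothetical stand-in for the calling procedure:

from delta.tables import DeltaTable

def mergeIntoDelta(spark, target):
    # forPath expects a filesystem location; passing a table name here
    # makes Delta look for a _delta_log at that "path" and fail with
    # "<name> is not a Delta table", even though the table exists
    return DeltaTable.forPath(spark, target)

# mergeIntoDelta(spark, table)  # wrong: table name, triggers the error
# mergeIntoDelta(spark, path)   # right: the path the table was created with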
IsDelta = True, Merge gets error "Not a delta table"
I'm running a Spark 2.4 cluster on which we've created a Delta table, and DeltaTable.isDeltaTable returns true for the table (well, for its path). However, when trying to run a merge statement, we get the error "<table> is not a Delta table." The table is referenced via its path for the merge, so the creation, validation and merge all use the path method, yet the error reports the table name rather than the path; I'm not sure whether that makes a difference.
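The check that returns true is presumably along these lines (a sketch, assuming the Python API; path is the same location used when the table was created):

from delta.tables import DeltaTable

# Returns True because the path really does contain a Delta table
DeltaTable.isDeltaTable(spark, path)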
create statement
df.write.format("delta").mode("overwrite").option("path", path).saveAsTable(table)
merge statement
deltaUpdate.merge(...).whenNotMatchedInsertAll().execute()
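For context, since the table is referenced by its path, deltaUpdate would be built roughly like this (a sketch, not the actual code from the post; sourceDf, the "target"/"source" aliases and the id join condition are placeholders):

from delta.tables import DeltaTable

# Load the merge target as a DeltaTable from its storage path,
# not from the metastore table name
deltaTable = DeltaTable.forPath(spark, path)

# Alias target and source, then merge on a placeholder key column
deltaUpdate = deltaTable.alias("target").merge(
    sourceDf.alias("source"),
    "target.id = source.id"  # placeholder join condition
)

deltaUpdate.whenNotMatchedInsertAll().execute()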
The merge statement itself is fine; it has been used on multiple other tables without issue.
If I query the table via either spark.sql or %%sql, both work fine, which suggests the table was created properly. So why am I getting the error, and how can I get around it (short of re-creating the table)?
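For reference, the sanity check that works is just a name-based query, e.g. (sketch; table is the name used in saveAsTable):

spark.sql(f"SELECT COUNT(*) FROM {table}").show()

which goes through the metastore by name, whereas the Delta table check and the merge go through the filesystem path, so the two can disagree if the wrong identifier is passed.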