I spent days looking at this and just now found the answer. Spark in Synapse uses a folder inside the workspace's ADLS storage account to save metadata, logs, etc. It lives at the root of the container, under /synapse/workspaces/<workspacename>. In my case (this is a test environment), someone or something had somehow created a second <workspacename> directory in ALL CAPS, so /synapse/workspaces/ contained two directories: one named <WORKSPACENAME> and one named <workspacename>. The all-caps one had a newer timestamp, from a few weeks ago, and did not match the actual case of the workspace name, so I deleted it. Once it was deleted, everything worked as it should.
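If you want to check for the same condition, listing the workspace metadata root from a notebook is enough. A minimal sketch using mssparkutils, assuming the <our-fs> and <our-sa> placeholders from the error message below are replaced with your real container and storage account names:

from notebookutils import mssparkutils

# List the folder where Synapse keeps per-workspace metadata.
root = "abfss://<our-fs>@<our-sa>.dfs.core.windows.net/synapse/workspaces/"
for entry in mssparkutils.fs.ls(root):
    # A healthy workspace shows exactly one directory here, matching the
    # workspace name's case; a duplicate in different case is the symptom.
    print(entry.name)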
Can't create/show databases in Azure Synapse notebook
When running the cell below in a Synapse notebook, as myself, within our workspace:
%%sql
SHOW DATABASES
the following error is returned: "Error: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to create path: abfss://<our-fs>@<our-sa>.dfs.core.windows.net/synapse/workspaces/<our-ws>/warehouse)"
Within the lake database designer I can create a database and see it, as well as the default database. Also, Spark can read and write data to and from the workspace's ADLS storage account using my AD account within a notebook.
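For example, a plain read/write round trip of this sort succeeds (a minimal sketch; the abfss placeholders and the /tmp/probe path are illustrative, not our actual paths):

# Write a small DataFrame straight to the workspace's ADLS container ...
df = spark.range(10)
df.write.mode("overwrite").parquet(
    "abfss://<our-fs>@<our-sa>.dfs.core.windows.net/tmp/probe")

# ... and read it back; both directions work with my AD account.
spark.read.parquet(
    "abfss://<our-fs>@<our-sa>.dfs.core.windows.net/tmp/probe").show()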
But saving any table, e.g. spark.sql("create table example(a int, b int)"), fails with the same error. spark.sql("show databases"), as coded above, produces the error. Running "create database <dbname>" within a notebook also produces the error. Essentially, any command that tries to access or alter the metastore from within a notebook produces the error above.
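For reference, a single cell reproducing all three failures (a minimal sketch; testdb is a hypothetical stand-in for <dbname>, and each line raises the same HiveException, so run them one at a time):

spark.sql("SHOW DATABASES")                       # listing databases fails
spark.sql("CREATE DATABASE testdb")               # creating a database fails
spark.sql("CREATE TABLE example (a INT, b INT)")  # creating a table fails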
My role within this workspace is Synapse Administrator, and I have the Blob Storage Contributor role on the storage account. Can you let me know what to check to see what may be wrong with our environment?