Creating Spark database in Azure Databricks with location to ADLS Gen 2 using ABFS driver throws an exception

Ramesh 50 Reputation points
2023-02-13T11:19:17.2533333+00:00

Hi Team, I am creating a database in Azure Databricks using an abfss location in the CREATE DATABASE statement, and it throws an exception.

  • Authentication to ADLS - session-scoped access key authentication, as below
  • Access method to ADLS - ABFS driver, as below
spark.conf.set("fs.azure.account.key.formula1supportdl.dfs.core.windows.net", 
                dbutils.secrets.get(scope="databricks-support-scope", key="formula1supportdl-account-key"))
%sql
CREATE DATABASE f1_demo_abfss
LOCATION 'abfss://******@formula1supportdl.dfs.core.windows.net/'

Error message as below

[Screenshot: error message, 2023-02-13 at 11.05.50]

Additional information

  • Access to the same storage account/container works perfectly fine if I use cluster-scoped authentication, i.e., when the Spark configuration is added to the cluster configuration instead of being set in the notebook.
  • Specifying the same location for a table also works perfectly fine. The problem occurs only with the CREATE DATABASE statement.
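For reference, the cluster-scoped variant mentioned above is entered in the cluster's Spark config (under Advanced options) rather than in a notebook cell. A sketch of that config line, using the documented `{{secrets/<scope>/<key>}}` reference syntax with the same account and secret names as the notebook snippet:

```
fs.azure.account.key.formula1supportdl.dfs.core.windows.net {{secrets/databricks-support-scope/formula1supportdl-account-key}}
```

With this in place the key is resolved at cluster start, which is why access works from any notebook on the cluster without calling spark.conf.set.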

Example CREATE TABLE with the same location works perfectly fine as below

[Screenshot: working CREATE TABLE statement, 2023-02-13 at 11.14.53]
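A rough sketch of the kind of CREATE TABLE that succeeds against the same location (the table name and columns here are illustrative assumptions, not taken from the screenshot; the container name is redacted as in the original):

```sql
-- Illustrative only: table name and schema are assumptions
CREATE TABLE f1_demo.circuits (
  circuit_id INT,
  name       STRING
)
LOCATION 'abfss://******@formula1supportdl.dfs.core.windows.net/circuits'
```

This runs in the same session, with the same session-scoped account key set via spark.conf.set, which is what makes the CREATE DATABASE failure look inconsistent.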

Since this fails only when the location is accessed by a CREATE DATABASE statement under session-scoped authentication, it seems to be a bug: cluster-scoped authentication works perfectly fine, and CREATE TABLE with the same location works perfectly fine as well.

Can this be looked into please? Let me know if you need more information to investigate.

Thanks

Azure Databricks
