Authenticating an Azure dedicated SQL pool from Databricks using an SPN client certificate

Manish Kumar Gupta 1 Reputation point
2022-04-29T05:41:52.327+00:00

Hello Team,

I have a requirement to authenticate an Azure dedicated SQL pool from Databricks using an SPN client certificate. I want to write data (Delta format) from ADLS Gen1 to the dedicated pool via Databricks. Could you please guide me on how to use a client certificate for authentication?

I want to use the "com.databricks.spark.sqldw" format to write data to the dedicated pool. I am able to write the data using the dedicated pool username and password, but I want to use a client certificate for authentication instead.

Please find below the code I am using for username and password authentication:
df.write \
    .mode("overwrite") \
    .format("com.databricks.spark.sqldw") \
    .option("url", "server_url:port_number;database=database_name;user=user;password=password") \
    .option("enableServicePrincipalAuth", "true") \
    .option("forwardSparkAzureStorageCredentials", "true") \
    .option("dbTable", "table_name") \
    .option("tempDir", "Blob_TempDir_path") \
    .save()
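
For context, acquiring an Azure AD token for an SPN with a client certificate typically looks something like the sketch below (using the MSAL Python library; the tenant ID, client ID, certificate path, and thumbprint are placeholders, and this only covers token acquisition, not the connector write itself):

import msal

# Sketch only: acquire an Azure AD access token for a service principal
# using a client certificate. All identifiers below are placeholders.
app = msal.ConfidentialClientApplication(
    client_id="<application_client_id>",
    authority="https://login.microsoftonline.com/<tenant_id>",
    client_credential={
        "private_key": open("/dbfs/certs/spn_private_key.pem").read(),
        "thumbprint": "<certificate_thumbprint>",
    },
)

# Resource scope for Azure SQL / Synapse dedicated SQL pools
result = app.acquire_token_for_client(scopes=["https://database.windows.net/.default"])
access_token = result.get("access_token")

Whether the com.databricks.spark.sqldw connector can consume such a token is exactly the open question here.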

Kindly advise.

Azure Databricks

1 answer

  1. ShaikMaheer-MSFT 38,466 Reputation points Microsoft Employee
    2022-05-02T16:29:57.947+00:00

    Hi @Manish Kumar Gupta ,

    Thank you for posting your query on the Microsoft Q&A platform.

    If I am not wrong, the Spark Synapse connector does not support SPN certificate authentication. I am checking on this further with the internal team and will get back to you with updates. Thank you.
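
    In the meantime, the service principal flow that the connector does document is secret based: the SPN's client ID and secret go into the Spark OAuth configuration, and enableServicePrincipalAuth tells the connector to reuse those credentials for both the storage tempDir and the dedicated pool. A rough sketch, assuming an ADLS Gen2 (abfss) tempDir and placeholder names throughout:

    # Sketch: service principal authentication with a client secret (not a
    # certificate). Tenant, client, storage, and table names are placeholders.
    tenant_id = "<tenant_id>"
    client_id = "<application_client_id>"
    client_secret = dbutils.secrets.get(scope="<secret_scope>", key="<secret_key>")

    # OAuth configuration for the storage account used as tempDir
    spark.conf.set("fs.azure.account.auth.type", "OAuth")
    spark.conf.set("fs.azure.account.oauth.provider.type",
                   "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
    spark.conf.set("fs.azure.account.oauth2.client.id", client_id)
    spark.conf.set("fs.azure.account.oauth2.client.secret", client_secret)
    spark.conf.set("fs.azure.account.oauth2.client.endpoint",
                   "https://login.microsoftonline.com/" + tenant_id + "/oauth2/token")

    # Write with the Synapse connector; no username/password in the URL,
    # the service principal above is used for both storage and the pool.
    df.write \
        .mode("overwrite") \
        .format("com.databricks.spark.sqldw") \
        .option("url", "jdbc:sqlserver://<server>.sql.azuresynapse.net:1433;database=<database>") \
        .option("enableServicePrincipalAuth", "true") \
        .option("dbTable", "<table_name>") \
        .option("tempDir", "abfss://<container>@<storage_account>.dfs.core.windows.net/tempdir") \
        .save()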

    1 person found this answer helpful.
