I was able to connect only by using "useAzureMSI". I think it is because of the VNETs we have. Thank you for all the help; this issue is solved now.
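In case it helps anyone else, this is roughly the write that worked for me. It is the same call as in the question below, just with forwardSparkAzureStorageCredentials replaced by useAzureMSI; it assumes the dedicated SQL pool's managed identity has been granted access (e.g. Storage Blob Data Contributor) on the staging ADLS Gen2 account, and url/tempDir/table are the same placeholders as in the question:

// Same write as in the question, but the SQL pool's managed identity
// accesses the staging folder instead of a forwarded storage account key.
fa_df.write
  .format("com.databricks.spark.sqldw")
  .option("url", sqlDwUrlSmall)
  .option("dbtable", "mydwTable")
  .option("useAzureMSI", "true")
  .option("tempDir", tempDir)
  .mode("overwrite")
  .save()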
Does the Azure Synapse connector for Azure Databricks work with a Dedicated SQL pool (formerly SQL DW)?
I have been trying to write a dataframe that is transformed in Azure Databricks into a Dedicated SQL pool (formerly SQL DW). We are not using Synapse in production yet.
I am following the instructions on this page: https://learn.microsoft.com/en-us/azure/databricks/data/data-sources/azure/synapse-analytics#parameters
But I can't get it to work. My code creates Parquet files in the ADLS Gen2 folder (based on the data in the dataframe), but when it then tries to copy from the intermediate storage into the DW, it fails with this error:
com.microsoft.sqlserver.jdbc.SQLServerException: Not able to validate external location because The remote server returned an error: (403) Forbidden.
I am using the code below:
val blobStorage = "myadls.dfs.core.windows.net"
val blobContainer = "udl-container"
val blobAccessKey = dbutils.secrets.get(scope = "mykeyvault", key = "key-ADLS2")
val tempDir = "abfss://" + blobContainer + "@" + blobStorage + "/MyPath"
val acntInfo = "fs.azure.account.key." + blobStorage
sc.hadoopConfiguration.set(acntInfo, blobAccessKey)
val dwDatabase = "mydwdatabase"
val dwServer = "mydwserver.database.windows.net"
val dwUser = "dwUserId"
val dwPass = "dwUserPwd"
val dwJdbcPort = "1433"
val dwJdbcExtraOptions = "encrypt=true;trustServerCertificate=true;hostNameInCertificate=*.database.windows.net;loginTimeout=30;"
val sqlDwUrl = "jdbc:sqlserver://" + dwServer + ":" + dwJdbcPort + ";database=" + dwDatabase + ";user=" + dwUser + ";password=" + dwPass + ";" + dwJdbcExtraOptions
val sqlDwUrlSmall = "jdbc:sqlserver://" + dwServer + ":" + dwJdbcPort + ";database=" + dwDatabase + ";user=" + dwUser + ";password=" + dwPass
spark.conf.set("spark.sql.parquet.writeLegacyFormat", "true")
fa_df.write
  .format("com.databricks.spark.sqldw")
  .option("url", sqlDwUrlSmall)
  .option("dbtable", "mydwTable")
  .option("forwardSparkAzureStorageCredentials", "true")
  .option("tempDir", tempDir)
  .mode("overwrite")
  .save()