Handshake fails trying to connect from Azure Databricks to Azure PostgreSQL with SSL

Tobias Himmighöfer

We're trying to connect to an Azure Database for PostgreSQL flexible server from Databricks using the JDBC driver org.postgresql.Driver.
Since the flexible server enforces SSL, we added the ssl and sslmode options to our existing code:

driver = "org.postgresql.Driver"
url = "jdbc:postgresql://<server>/<db>"
user = "<user>"
password = "<password>"
query = "<someQuery>"

remote_table = spark.read.format("jdbc") \
  .option("driver", driver) \
  .option("url", url) \
  .option("user", user) \
  .option("password", password) \
  .option("query", query) \
  .option("ssl", True) \
  .option("sslmode", "require") \
  .load()

The error we get is

org.postgresql.util.PSQLException: SSL error: Received fatal alert: handshake_failure

How do we establish an SSL connection to our Postgres using the above code? Any help would be greatly appreciated.

We know this can be done using psycopg2 and sslmode="require" (we did so successfully in a separate notebook to verify that our password is correct and the firewall is configured accordingly), but we'd really prefer to integrate the process into our existing solution with as few changes as possible.
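For comparison, the psycopg2 check mentioned above looks roughly like this; the host, database, and credential values are placeholders, not taken from the original post:

```python
# Placeholder connection parameters -- substitute your own server and credentials.
conn_params = {
    "host": "<server>.postgres.database.azure.com",
    "dbname": "<db>",
    "user": "<user>",
    "password": "<password>",
    "sslmode": "require",  # flexible server enforces SSL
}

def run_query(query):
    """Open an SSL connection, run the query, and return all rows."""
    import psycopg2  # third-party; preinstalled on Databricks runtimes

    with psycopg2.connect(**conn_params) as conn:
        with conn.cursor() as cur:
            cur.execute(query)
            return cur.fetchall()
```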

Thanks in advance.

Azure Databricks
Azure Database for PostgreSQL

Accepted answer
  PRADEEPCHEEKATLA-MSFT (Microsoft Employee)

    Hello @Tobias Himmighöfer ,

    Welcome to the Microsoft Q&A platform.

    As per my repro, when I executed the above command for the first time, I experienced the same problem.


    Solution: It started working after overriding the Java security properties for the driver and executor:

    spark.driver.extraJavaOptions -Djava.security.properties=
    spark.executor.extraJavaOptions -Djava.security.properties=



    What is actually happening is that the JVM's "security" property reads the file /databricks/spark/dbconf/java/extra.security by default, and in this file some TLS algorithms are disabled. That means that if you edit this file and remove the TLS ciphers that the Postgres flexible server needs from the disabled list, that should also work.

    Setting this variable only on the executors (spark.executor.extraJavaOptions) does not change the JVM's defaults. The driver, however, does overwrite them, and so the connection starts to work.
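    For reference, once the two extraJavaOptions lines above are added to the cluster's Spark config, the original JDBC read should succeed unchanged. A minimal sketch follows; the server, database, and credentials are placeholders, and note that ssl/sslmode can equivalently be appended to the JDBC URL as query parameters:

```python
# Sketch: JDBC read from Azure Database for PostgreSQL over SSL.
# <server>, <db>, <user>, <password>, <someQuery> are placeholders.

def build_url(server: str, db: str) -> str:
    # ssl/sslmode may be passed as URL query parameters
    # instead of separate .option() calls.
    return f"jdbc:postgresql://{server}:5432/{db}?ssl=true&sslmode=require"

def read_table(spark, server, db, user, password, query):
    # Requires the cluster-level Spark config:
    #   spark.driver.extraJavaOptions -Djava.security.properties=
    #   spark.executor.extraJavaOptions -Djava.security.properties=
    return (spark.read.format("jdbc")
            .option("driver", "org.postgresql.Driver")
            .option("url", build_url(server, db))
            .option("user", user)
            .option("password", password)
            .option("query", query)
            .load())
```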


    Hope this helps. Do let us know if you have any further queries.


