Question

TobiasHimmighfer-2818 asked:

Handshake fails trying to connect from Azure Databricks to Azure PostgreSQL with SSL

We're trying to connect to an Azure Database for PostgreSQL flexible server from Databricks using the jdbc driver org.postgresql.Driver.
Since the flexible server enforces SSL, we added the ssl and sslmode options to our existing code:

 driver = "org.postgresql.Driver"
 url = "jdbc:postgresql://<server>/<db>"
 user = "<user>"
 password = "<password>"
 query = "<someQuery>"

 remote_table = spark.read.format("jdbc") \
   .option("driver", driver) \
   .option("url", url) \
   .option("user", user) \
   .option("password", password) \
   .option("query", query) \
   .option("ssl", True) \
   .option("sslmode", "require") \
   .load()

The error we get is

 org.postgresql.util.PSQLException: SSL error: Received fatal alert: handshake_failure

How do we establish an SSL connection to our Postgres using the above code? Any help would be greatly appreciated.

We know this can be done using psycopg2 with sslmode="require" (and did so successfully in a separate notebook to verify that our password is correct and the firewall is configured accordingly), but we'd really prefer to integrate this into our existing solution with as few changes as possible.
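For reference, the psycopg2 verification looked roughly like this (a minimal sketch with placeholder names; the actual connect call is commented out because it needs a reachable server):

```python
def pg_conn_kwargs(host, db, user, password):
    """Connection arguments for an Azure PostgreSQL flexible server.
    sslmode="require" makes psycopg2 negotiate TLS, which the
    flexible server enforces."""
    return {
        "host": host,
        "dbname": db,
        "user": user,
        "password": password,
        "sslmode": "require",
    }

# Needs a live server, so commented out here:
# import psycopg2
# conn = psycopg2.connect(**pg_conn_kwargs("<server>", "<db>", "<user>", "<password>"))
```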

Thanks in advance.

Tags: azure-databricks, azure-database-postgresql


1 Answer

PRADEEPCHEEKATLA-MSFT answered:

Hello @TobiasHimmighfer-2818,

Welcome to the Microsoft Q&A platform.

As per my repro, I experienced the same problem the first time I executed the above command.


Solution: It started working after overriding the Java security properties for both the driver and the executor in the cluster's Spark configuration:

 spark.driver.extraJavaOptions -Djava.security.properties=    
 spark.executor.extraJavaOptions -Djava.security.properties=


Reason:

What is happening in reality is that the JVM's security configuration reads, by default, the file /databricks/spark/dbconf/java/extra.security, and in this file some TLS algorithms are disabled by default. That means that editing this file and replacing the TLS ciphers that the Postgres flexible server needs with an empty string should also work.

Setting this variable only on the executors (spark.executor.extraJavaOptions) does not change the JVM defaults. The driver behaves differently: there the setting does override the defaults, which is why the connection starts to work.
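To illustrate the mechanism: extra.security follows the standard java.security properties format, where jdk.tls.disabledAlgorithms lists algorithms the JVM refuses to negotiate. A minimal sketch (the file contents and the parser below are hypothetical, not the actual Databricks file):

```python
def parse_disabled_algorithms(security_properties_text):
    """Return the algorithm names listed under jdk.tls.disabledAlgorithms
    in a java.security-style properties file."""
    for line in security_properties_text.splitlines():
        line = line.strip()
        if line.startswith("jdk.tls.disabledAlgorithms"):
            _, _, value = line.partition("=")
            return {item.strip() for item in value.split(",") if item.strip()}
    return set()

# Hypothetical contents of an override file such as extra.security:
sample = """\
# extra overrides shipped with the cluster image
jdk.tls.disabledAlgorithms=SSLv3, TLSv1, TLSv1.1, RC4
"""
print(sorted(parse_disabled_algorithms(sample)))
# → ['RC4', 'SSLv3', 'TLSv1', 'TLSv1.1']
```

Passing -Djava.security.properties= with an empty value points the JVM at no override file, so none of these extra disables are applied. If you want to inspect the effective value from a notebook (assuming a running cluster), java.security.Security.getProperty("jdk.tls.disabledAlgorithms") is reachable through spark.sparkContext._jvm.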


Hope this helps. Do let us know if you have any further queries.






Works like a charm, thank you so much for your answer and the detailed explanation.


Hello @TobiasHimmighfer-2818,

Glad to know that it helped.
