How to use a different version of a Spark Java library dependency (antlr4) in a Databricks notebook?

Martin Medina 5 Reputation points
2024-01-22T22:32:40.6433333+00:00

Hello. I need to use, in a Databricks notebook, a custom-made Java library that depends on Drools v8.40.1.Final, which in turn depends on ANTLR4 v4.10.1. When I invoke a method in my Java library I get the following error: "ANTLR Tool version 4.10.1 used for code generation does not match the current runtime version 4.9.3. UnsupportedOperationException: java.io.InvalidClassException: org.antlr.v4.runtime.atn.ATN; Could not deserialize ATN with version 4 (expected 3)". From searching the web I learned that the Databricks runtime's Spark Java libraries use ANTLR4 v4.9.3, which is incompatible with the version my library needs (https://github.com/graphql-java-kickstart/graphql-java-tools/issues/767#issuecomment-1803407930). I also learned that the Databricks Spark Java libraries cannot be overridden, and that if I add conflicting libraries, the Spark driver libraries take precedence (https://docs.databricks.com/en/workflows/jobs/how-to/use-jars-in-workflows.html#manage-library-dependencies). Do you know of a way to make my library work? I know I could downgrade to a Drools version that uses an older ANTLR4 library; in fact, Drools v8.30.0.Final uses ANTLR4 v4.9.2, which works on Databricks clusters with runtime v14.2. However, I have a non-functional requirement to use a specific Drools version that has support.
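One way to confirm which ANTLR4 runtime the cluster driver actually loads is to read `org.antlr.v4.runtime.RuntimeMetaData.VERSION` from a notebook cell. A small sketch using reflection (the class name `AntlrVersionCheck` is just illustrative; reflection keeps it compilable even where ANTLR is absent):

```java
// Diagnostic sketch: report the ANTLR4 runtime version visible on the classpath.
// RuntimeMetaData.VERSION is the public version constant of the ANTLR4 runtime.
public class AntlrVersionCheck {
    public static String antlrRuntimeVersion() {
        try {
            Class<?> meta = Class.forName("org.antlr.v4.runtime.RuntimeMetaData");
            return (String) meta.getField("VERSION").get(null);
        } catch (ReflectiveOperationException e) {
            return "ANTLR4 runtime not on classpath";
        }
    }

    public static void main(String[] args) {
        System.out.println(antlrRuntimeVersion());
    }
}
```

On a Databricks 14.x cluster this would be expected to report 4.9.3, matching the version named in the error message.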

Azure Databricks
An Apache Spark-based analytics platform optimized for Azure.

1 answer

  1. Martin Medina 5 Reputation points
    2024-01-26T00:21:10.34+00:00

    We ended up using shading with a Maven plugin to relocate the ANTLR4 classes to a different parent package, so they don't collide with the copy that the Spark runtime uses.
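    The answer doesn't show the configuration, but with the Maven Shade Plugin a relocation would look roughly like the following sketch (the `shaded.` prefix and the plugin version are assumptions, not taken from the answer):

    ```xml
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>3.5.1</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
          <configuration>
            <relocations>
              <!-- Rewrite the ANTLR4 package in the uber-jar so it no longer
                   clashes with the ANTLR4 4.9.3 bundled in the Databricks runtime -->
              <relocation>
                <pattern>org.antlr.v4</pattern>
                <shadedPattern>shaded.org.antlr.v4</shadedPattern>
              </relocation>
            </relocations>
          </configuration>
        </execution>
      </executions>
    </plugin>
    ```

    The plugin rewrites both the class files and the bytecode references in the shaded jar, so the library and Drools resolve the relocated ANTLR4 classes while Spark keeps using its own.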

    1 person found this answer helpful.
