Hello @Sharukh Kundagol - Thanks for the question and for using the MS Q&A platform.
The error you are seeing occurs because the method `getLocalProperty` is not whitelisted on the `JavaSparkContext` class. This is a known limitation of High Concurrency clusters in Databricks. One way to resolve the issue is to use a Standard cluster instead of a High Concurrency cluster. However, if you do not have the option to change the cluster type, you can try the following workaround:
- Open the notebook where you are seeing the error.
- Click on the "Edit" button to edit the notebook.
- Add the following line of code at the beginning of the notebook:

  `spark.conf.set("spark.driver.extraJavaOptions", "-Dio.netty.tryReflectionSetAccessible=true")`
- Save the notebook and try running it again.
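For reference, the steps above amount to adding a cell like the following at the top of the notebook. This is a minimal sketch: it assumes a Databricks notebook, where the `spark` SparkSession object is provided automatically by the runtime, and it uses only the configuration key quoted above.

```python
# First cell of the notebook.
# `spark` is the SparkSession injected by the Databricks notebook runtime.
spark.conf.set(
    "spark.driver.extraJavaOptions",
    "-Dio.netty.tryReflectionSetAccessible=true",
)

# Optionally read the value back to confirm the setting was applied.
print(spark.conf.get("spark.driver.extraJavaOptions"))
```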
Hope this helps. If this answers your query, do click "Accept Answer" and "Yes" for "Was this answer helpful". And if you have any further queries, do let us know.