The error you're seeing is a Py4J security exception: the method org.apache.spark.api.java.JavaRDD.rdd() is not in the whitelist, which is why the call is rejected (this typically happens on clusters with access control enabled). That method converts a Java RDD to a Scala RDD, and you don't always need to call it explicitly.
I assume you want to run a SQL query against a Delta table and extract the VERSION field. You don't need the JSON conversion step at all (that's what goes through the JavaRDD): instead of converting the DataFrame result to JSON and then collecting it, collect the DataFrame directly and read the VERSION field from the first row.
# Run SQL query
result = spark.sql("SELECT VERSION FROM (DESCRIBE HISTORY delta.`/sampDelta/names/` LIMIT 1)")
# Collect result and get the VERSION value
version = result.collect()[0]["VERSION"]
print(version)
# Create a 'vers' text widget pre-filled with the version value
# (note: if the widget already exists, its default is not updated)
dbutils.widgets.text('vers', str(version))