Cluster configuration error
Hi Expert,
I am getting the following error while creating a Databricks cluster configured against an external Hive metastore:
java.lang.NoClassDefFoundError: org/apache/hadoop/hive/metastore/api/NoSuchObjectException
My cluster Spark config starts with the connection settings:

spark.hadoop.javax.jdo.option.ConnectionDriverName com.microsoft.sqlserver.jdbc.SQLServerDriver
spark.hadoop.javax.jdo.option.ConnectionURL z1.db.windows.net,1433;Initial Catalog=Hivemetastore;TrustServerCertificate=False;Connection Timeout=30;Authentication="Active Directory Default";
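One thing I am not sure about: the ConnectionURL above is in ADO.NET style, but as far as I understand the Microsoft JDBC driver expects a jdbc:sqlserver:// URL. This is roughly what I think it should look like (host and database name copied from above; the exact property names are my assumption from the mssql-jdbc documentation):

spark.hadoop.javax.jdo.option.ConnectionURL jdbc:sqlserver://z1.db.windows.net:1433;databaseName=Hivemetastore;encrypt=true;trustServerCertificate=false;loginTimeout=30

I believe authentication=ActiveDirectoryDefault would be the JDBC equivalent of the Authentication="Active Directory Default" setting, but since I also pass ConnectionUserName/ConnectionPassword below I left it out here.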
The rest of the cluster Spark config, one setting per line:

spark.hadoop.javax.jdo.option.ConnectionUserName test
spark.hadoop.javax.jdo.option.ConnectionPassword "dd"
datanucleus.fixedDatastore false
datanucleus.autoCreateSchema true
hive.metastore.schema.verification false
hive.metastore.schema.verification.record.version true
spark.sql.hive.metastore.version 1.2.0
# spark.sql.hive.metastore.version 2.3.0
# Skip this one if <hive-version> is 0.13.x.
spark.sql.hive.metastore.jars maven
# spark.sql.hive.metastore.jars builtin
spark.databricks.delta.preview.enabled true

In a notebook I additionally set:

spark.conf.set("spark.databricks.delta.schema.autoMerge.enabled", "true")
spark.conf.set("spark.databricks.parquet.schema.autoMerge.enabled", "true")
spark.sql("SET spark.databricks.delta.schema.autoMerge.enabled = true")
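As a sanity check on the host, database, and credentials, I can read the metastore's VERSION table (the standard Hive schema-version table) over plain JDBC from a notebook. This is just a sketch reusing the values above, and it assumes the jdbc:sqlserver:// URL form from earlier:

# Read the Hive metastore's VERSION table directly over JDBC to verify that
# the connection and credentials work outside the metastore client itself.
url = "jdbc:sqlserver://z1.db.windows.net:1433;databaseName=Hivemetastore;encrypt=true;trustServerCertificate=false;loginTimeout=30"
df = (spark.read.format("jdbc")
      .option("url", url)
      .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
      .option("dbtable", "VERSION")  # Hive metastore schema-version table
      .option("user", "test")        # ConnectionUserName from the cluster conf
      .option("password", "dd")      # ConnectionPassword from the cluster conf
      .load())
df.show()

If that read succeeds, I assume the remaining problem is on the metastore client side, i.e. the spark.sql.hive.metastore.version / spark.sql.hive.metastore.jars pairing, but I am not sure.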
Is there anything I am missing in this configuration?