Databricks Spark Scala: RaiseError throws type error

bn2302 0 Reputation points
2024-06-05T08:24:16.67+00:00

Hello,

I am facing an issue on Databricks Runtime 14.3 LTS in a Scala notebook. When I try to raise an exception using Spark Catalyst expressions with the following Scala code:

import org.apache.spark.sql.catalyst.expressions.{Literal, RaiseError}
import org.apache.spark.sql.types.{StringType, DateType}

val errorMessage = Literal.create("error message text", StringType)
val errorExpression = RaiseError(errorMessage, DateType)

I receive this error:


error: type mismatch;
 found   : org.apache.spark.sql.types.DateType.type
 required: org.apache.spark.sql.catalyst.expressions.Expression
val errorExpression = RaiseError(errorMessage, DateType)

However, this code executes as expected using Open Source Apache Spark 3.5.0.

Can anyone suggest what might be wrong?

Additionally, how can I check the sources of the installed Spark libraries in the workspace?

Thank you for your assistance.

Azure Databricks
An Apache Spark-based analytics platform optimized for Azure.

1 answer

Sort by: Most helpful
  1. PRADEEPCHEEKATLA-MSFT 82,271 Reputation points Microsoft Employee
    2024-06-05T10:55:55.8233333+00:00

    @bn2302 - Thanks for the question and using MS Q&A platform.

    The error is not actually caused by a missing import: the compiler message shows that DateType resolves, but it is passed where an Expression is required. On Databricks Runtime 14.3 LTS, the second parameter of RaiseError is an Expression rather than a DataType (as in open-source Apache Spark 3.5.0), so the DataType has to be wrapped in a Literal. Here's the updated code:

    import org.apache.spark.sql.catalyst.expressions.{Literal, RaiseError}
    import org.apache.spark.sql.types.{StringType, DateType}
    
    val errorMessage = Literal.create("error message text", StringType)
    val errorExpression = RaiseError(errorMessage, Literal.create(null, DateType))
    
    

    By passing an Expression (here, Literal.create(null, DateType)) as the second argument, the type mismatch is resolved.
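    As an alternative, the public raise_error function from org.apache.spark.sql.functions avoids the internal Catalyst API entirely, and so is less sensitive to differences between open-source Spark and the Databricks Runtime. A minimal sketch (the column name and sample data are illustrative; in a notebook the spark session already exists, so the builder line below is only needed to make the snippet self-contained):

    ```scala
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.{col, lit, raise_error, when}

    // In a Databricks notebook `spark` is predefined; this builder is
    // only here so the snippet runs standalone.
    val spark = SparkSession.builder().master("local[1]").getOrCreate()
    import spark.implicits._

    val df = Seq(1, -2, 3).toDF("value")

    // raise_error is part of the stable public DataFrame API (Spark 3.1+),
    // so the same code works on OSS Spark and on Databricks Runtime.
    val checked = df.select(
      when(col("value") < 0, raise_error(lit("error message text")))
        .otherwise(col("value"))
        .as("value")
    )
    // Collecting `checked` throws at runtime when a negative value is hit.
    ```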


    Additionally, how can I check the sources of the installed Spark libraries in the workspace?

    Use the Databricks CLI or the Databricks REST API to list installed libraries.

    For more details, refer to Get the list of installed libraries across all databricks workspace clusters.
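    For a quick check from inside a notebook (without the CLI), the running Spark version and the jar a given class was loaded from can be inspected directly. A sketch using standard Java reflection (the code-source lookup can return null in some environments, so it is guarded):

    ```scala
    // Print the Spark version the cluster is actually running.
    println(org.apache.spark.SPARK_VERSION)

    // Locate the jar that a specific class (here RaiseError) was loaded
    // from; on Databricks this reveals whether a forked Spark build is
    // in use instead of the open-source release.
    val codeSource = Option(
      classOf[org.apache.spark.sql.catalyst.expressions.RaiseError]
        .getProtectionDomain.getCodeSource
    )
    println(codeSource.map(_.getLocation).getOrElse("(no code source available)"))
    ```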

    Hope this helps. Do let us know if you have any further queries.


    If this answers your query, do click Accept Answer and Yes for was this answer helpful. And, if you have any further query do let us know.