Returns dividend/divisor. It always performs floating point division, and the result is NULL if the divisor is 0. Supports Spark Connect.
For the corresponding Databricks SQL function, see the try_divide function.
Syntax

```python
from pyspark.databricks.sql import functions as dbf

dbf.try_divide(left=<left>, right=<right>)
```
Parameters
| Parameter | Type | Description |
|---|---|---|
| `left` | `pyspark.sql.Column` or column name | The dividend. |
| `right` | `pyspark.sql.Column` or column name | The divisor. |
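
Either parameter may be passed as a `pyspark.sql.Column` expression or as a column-name string. A minimal sketch of the two calling styles, assuming an active `spark` session and the same functions import used in the examples below:

```python
from pyspark.databricks.sql import functions as dbf

df = spark.createDataFrame([(10, 4)], ["a", "b"])

# Column-name strings and Column expressions are interchangeable.
df.select(
    dbf.try_divide("a", "b"),          # column names as strings
    dbf.try_divide(df.a, dbf.lit(2)),  # Column expressions
).show()
```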
Examples
```python
from pyspark.databricks.sql import functions as dbf

spark.createDataFrame(
    [(6000, 15), (1990, 2), (1234, 0)], ["a", "b"]
).select("*", dbf.try_divide("a", "b")).show()
```

```
+----+---+----------------+
|   a|  b|try_divide(a, b)|
+----+---+----------------+
|6000| 15|           400.0|
|1990|  2|           995.0|
|1234|  0|            NULL|
+----+---+----------------+
```
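
Because a zero divisor yields NULL rather than an error, the result can be combined with a coalesce to substitute a default value. A minimal sketch building on the DataFrame above; it assumes `coalesce` is available from the same functions module used in these examples:

```python
from pyspark.databricks.sql import functions as dbf

df = spark.createDataFrame([(6000, 15), (1990, 2), (1234, 0)], ["a", "b"])

# Replace the NULL produced by a zero divisor with a default of 0.0.
df.select(
    "*",
    dbf.coalesce(dbf.try_divide("a", "b"), dbf.lit(0.0)).alias("safe_ratio"),
).show()
```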
```python
from pyspark.databricks.sql import functions as dbf

df = spark.range(4).select(dbf.make_interval(dbf.lit(1)).alias("itvl"), "id")
df.select("*", dbf.try_divide("itvl", "id")).show()
```

```
+-------+---+--------------------+
|   itvl| id|try_divide(itvl, id)|
+-------+---+--------------------+
|1 years|  0|                NULL|
|1 years|  1|             1 years|
|1 years|  2|            6 months|
|1 years|  3|            4 months|
+-------+---+--------------------+
```
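
Unlike the plain division operator, try_divide does not fail when ANSI SQL mode is enabled: it returns NULL for a zero divisor instead of raising a division-by-zero error. A minimal sketch of the contrast, assuming the `spark.sql.ansi.enabled` configuration can be set in your environment:

```python
from pyspark.databricks.sql import functions as dbf

spark.conf.set("spark.sql.ansi.enabled", "true")
df = spark.createDataFrame([(10, 0)], ["a", "b"])

# Under ANSI mode, uncommenting the next line raises a division-by-zero error.
# df.select(df.a / df.b).show()

# try_divide returns NULL for the zero divisor instead of failing the query.
df.select(dbf.try_divide("a", "b")).show()
```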