log

Returns the logarithm of arg2 using arg1 as the base. If there is only one argument, returns the natural logarithm of that argument. Supports Spark Connect.

For the corresponding Databricks SQL function, see log function.

Syntax

from pyspark.databricks.sql import functions as dbf

dbf.log(arg1=<arg1>, arg2=<arg2>)

Parameters

Parameter Type Description
arg1 pyspark.sql.Column, str or float The base of the logarithm, or, when arg2 is omitted, the number whose natural logarithm (base e) is returned.
arg2 pyspark.sql.Column, str or float, optional The number to compute the logarithm of.

Returns

pyspark.sql.Column: the logarithm of the given value.

Examples

from pyspark.databricks.sql import functions as dbf
# Base-2 logarithm of each value in the column.
df = spark.sql("SELECT * FROM VALUES (1), (2), (4) AS t(value)")
df.select("*", dbf.log(2.0, df.value)).show()
+-----+---------------+
|value|LOG(2.0, value)|
+-----+---------------+
|    1|            0.0|
|    2|            1.0|
|    4|            2.0|
+-----+---------------+
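
The parameters also accept a column name as a str, so the same base-2 example can be written by passing the column name directly. A minimal sketch, assuming dbf.log resolves string column names the same way pyspark.sql.functions.log does:

from pyspark.databricks.sql import functions as dbf
df = spark.sql("SELECT * FROM VALUES (1), (2), (4) AS t(value)")
# Pass the column name as a string instead of a Column object;
# this produces the same LOG(2.0, value) column as above.
df.select("*", dbf.log(2.0, "value")).show()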

from pyspark.databricks.sql import functions as dbf
# Base-3 logarithm; non-positive and NULL inputs produce NULL.
df = spark.sql("SELECT * FROM VALUES (1), (2), (0), (-1), (NULL) AS t(value)")
df.select("*", dbf.log(3.0, df.value)).show()
+-----+------------------+
|value|   LOG(3.0, value)|
+-----+------------------+
|    1|               0.0|
|    2|0.6309297535714...|
|    0|              NULL|
|   -1|              NULL|
| NULL|              NULL|
+-----+------------------+
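
As noted above, calling the function with a single argument returns the natural logarithm. A minimal sketch, assuming the single-argument form of dbf.log behaves like pyspark.sql.functions.log:

from pyspark.databricks.sql import functions as dbf
df = spark.sql("SELECT * FROM VALUES (1), (2), (4) AS t(value)")
# Single-argument form: natural logarithm (base e).
# Expected values: ln(1) = 0.0, ln(2) ≈ 0.693, ln(4) ≈ 1.386.
df.select("*", dbf.log(df.value)).show()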