Converts the number of seconds since the Unix epoch (1970-01-01 00:00:00 UTC) to a string representing the timestamp of that moment in the current system time zone, in the given format.
For the corresponding Databricks SQL function, see from_unixtime function.
Syntax
from pyspark.databricks.sql import functions as dbf
dbf.from_unixtime(timestamp=<timestamp>, format=<format>)
Parameters
| Parameter | Type | Description |
|---|---|---|
| `timestamp` | `pyspark.sql.Column` or `str` | Column of Unix time values. |
| `format` | literal string, optional | Format to use for the conversion (default: `yyyy-MM-dd HH:mm:ss`). |
Returns
pyspark.sql.Column: the formatted timestamp as a string.
Examples
# Pin the session time zone so the rendered timestamp is deterministic.
spark.conf.set("spark.sql.session.timeZone", "America/Los_Angeles")
from pyspark.databricks.sql import functions as dbf
df = spark.createDataFrame([(1428476400,)], ['unix_time'])
# 1428476400 seconds after the epoch is 2015-04-08 07:00:00 UTC,
# which renders as 2015-04-08 00:00:00 in America/Los_Angeles.
df.select('*', dbf.from_unixtime('unix_time')).show()
# Restore the default session time zone.
spark.conf.unset("spark.sql.session.timeZone")
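The example above uses the default pattern. As a minimal sketch of the optional format parameter (assuming the same df, imports, and session time zone as above; the alias date_str is chosen here only for illustration), a custom Spark datetime pattern can be passed as the second argument:
# Render the Unix seconds as a date-only string using a custom pattern.
# Under the America/Los_Angeles setting above, this should show 2015-04-08.
df.select(dbf.from_unixtime('unix_time', 'yyyy-MM-dd').alias('date_str')).show()
Valid pattern letters follow Spark's datetime patterns for formatting and parsing (for example, yyyy-MM-dd or MM/dd HH:mm).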