Returns the UNIX timestamp of the given time.
For the corresponding Databricks SQL function, see the to_unix_timestamp function.
Syntax
from pyspark.databricks.sql import functions as dbf
dbf.to_unix_timestamp(timestamp=<timestamp>, format=<format>)
Parameters
| Parameter | Type | Description |
|---|---|---|
| `timestamp` | `pyspark.sql.Column` or `str` | Input column, or the name of a column, containing the timestamp values to convert. |
| `format` | `pyspark.sql.Column` or `str`, optional | Format used to parse the timestamp values. |
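As the examples below show, a plain string passed as `format` is treated as a column name, so a literal pattern must be wrapped in `lit`. A minimal sketch of the two call shapes, assuming an active `spark` session and a hypothetical DataFrame with a `dt` column of date strings and an `fmt` column of per-row patterns:

from pyspark.databricks.sql import functions as dbf

df = spark.createDataFrame([('2015-04-08', 'yyyy-MM-dd')], ['dt', 'fmt'])

# Literal pattern: wrap it in lit(); a bare string would be read as a column name.
df.select(dbf.to_unix_timestamp('dt', dbf.lit('yyyy-MM-dd'))).show()

# Per-row pattern: pass the name of the column that holds the format.
df.select(dbf.to_unix_timestamp('dt', 'fmt')).show()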
Examples
# Pin the session time zone so the results are deterministic.
spark.conf.set("spark.sql.session.timeZone", "America/Los_Angeles")

from pyspark.databricks.sql import functions as dbf

# Example 1: parse a timestamp string using the default format.
df = spark.createDataFrame([('2015-04-08 12:12:12',)], ['ts'])
df.select('*', dbf.to_unix_timestamp('ts')).show()

# Example 2: parse a date string using an explicit literal format.
df = spark.createDataFrame([('2015-04-08',)], ['dt'])
df.select('*', dbf.to_unix_timestamp(df.dt, dbf.lit('yyyy-MM-dd'))).show()

# Example 3: take the format from another column, one pattern per row.
df = spark.createDataFrame(
    [('2015-04-08', 'yyyy-MM-dd'), ('2025+01+09', 'yyyy+MM+dd')], ['dt', 'fmt'])
df.select('*', dbf.to_unix_timestamp('dt', 'fmt')).show()

# Restore the default session time zone.
spark.conf.unset("spark.sql.session.timeZone")
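The same conversion can also be run through the SQL function referenced above. A minimal sketch, assuming the Databricks SQL to_unix_timestamp function accepts a timestamp expression and an optional format argument like the Python API:

# Sketch of the SQL counterpart; arguments are assumed to mirror the Python API above.
spark.sql(
    "SELECT to_unix_timestamp('2015-04-08 12:12:12') AS epoch_seconds"
).show()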