unix_timestamp

Converts a time string with a given pattern ('yyyy-MM-dd HH:mm:ss' by default) to a Unix timestamp (in seconds), using the default timezone and the default locale. Returns null if the conversion fails. If timestamp is None, returns the current timestamp.

For the corresponding Databricks SQL function, see unix_timestamp function.

Syntax

import pyspark.sql.functions as sf

# Returns current timestamp
sf.unix_timestamp()

# With timestamp
sf.unix_timestamp(timestamp=<timestamp>)

# With timestamp and format
sf.unix_timestamp(timestamp=<timestamp>, format=<format>)

Parameters

Parameter  Type                       Description
timestamp  pyspark.sql.Column or str  Optional. Timestamps as string values.
format     str                        Optional. Alternative format to use for the conversion (default: 'yyyy-MM-dd HH:mm:ss').
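
Because timestamp accepts either a column name or a Column object, the two calls below are equivalent. A minimal sketch (the DataFrame and column name are illustrative):

import pyspark.sql.functions as sf

df = spark.createDataFrame([('2015-04-08 12:12:12',)], ['ts'])

# Passing the column name as a string and passing a Column object
# produce the same result.
df.select(sf.unix_timestamp('ts'), sf.unix_timestamp(sf.col('ts'))).show()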

Returns

pyspark.sql.Column: the Unix time as a long integer.
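
To confirm the return type, a quick check (illustrative; Spark's LongType is reported as 'bigint' by dtypes):

import pyspark.sql.functions as sf
df = spark.createDataFrame([('2015-04-08 12:12:12',)], ['ts'])
# LongType surfaces as 'bigint' in the DataFrame schema
df.select(sf.unix_timestamp('ts').alias('ut')).dtypes
# [('ut', 'bigint')]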

Examples

Example 1: Return the current Unix timestamp.

import pyspark.sql.functions as sf
spark.range(1).select(sf.unix_timestamp().alias('unix_time')).show()
+----------+
| unix_time|
+----------+
|1702018137|
+----------+

Example 2: Parse a timestamp string using the default format 'yyyy-MM-dd HH:mm:ss'.

import pyspark.sql.functions as sf
df = spark.createDataFrame([('2015-04-08 12:12:12',)], ['ts'])
df.select('*', sf.unix_timestamp('ts')).show()
+-------------------+---------------------------------------+
|                 ts|unix_timestamp(ts, yyyy-MM-dd HH:mm:ss)|
+-------------------+---------------------------------------+
|2015-04-08 12:12:12|                             1428520332|
+-------------------+---------------------------------------+

Example 3: Parse a date string using the user-specified format 'yyyy-MM-dd'.

import pyspark.sql.functions as sf
df = spark.createDataFrame([('2015-04-08',)], ['dt'])
df.select('*', sf.unix_timestamp('dt', 'yyyy-MM-dd')).show()
+----------+------------------------------+
|        dt|unix_timestamp(dt, yyyy-MM-dd)|
+----------+------------------------------+
|2015-04-08|                    1428476400|
+----------+------------------------------+