Returns the number of seconds since 1970-01-01 00:00:00 UTC. Truncates higher levels of precision.
For the corresponding Databricks SQL function, see unix_seconds function.
Syntax
from pyspark.sql import functions as dbf
dbf.unix_seconds(col=<col>)
Parameters
| Parameter | Type | Description |
|---|---|---|
| `col` | `pyspark.sql.Column` or `str` | Input column of values to convert. |
Returns
pyspark.sql.Column: the number of seconds since 1970-01-01 00:00:00 UTC.
Examples
# Pin the session time zone so timestamp parsing is deterministic.
spark.conf.set("spark.sql.session.timeZone", "America/Los_Angeles")
from pyspark.sql import functions as dbf
df = spark.createDataFrame([('2015-07-22 10:00:00',), ('2022-10-09 11:12:13',)], ['ts'])
# Parse each string as a timestamp in the session time zone, then convert
# to seconds since the Unix epoch (UTC).
df.select('*', dbf.unix_seconds(dbf.to_timestamp('ts'))).show()
# Restore the default session time zone.
spark.conf.unset("spark.sql.session.timeZone")
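For intuition, the conversion above can be sketched in plain Python without Spark: parse the local timestamp in the `America/Los_Angeles` zone and truncate to whole seconds since the epoch. The helper name `unix_seconds_py` is hypothetical and only illustrates the semantics; it is not part of PySpark.

```python
# Minimal sketch of what unix_seconds computes for a single value, assuming
# the input string is a local time in the given IANA time zone.
from datetime import datetime
from zoneinfo import ZoneInfo

def unix_seconds_py(ts: str, tz: str = "America/Los_Angeles") -> int:
    """Parse a local timestamp string and return truncated epoch seconds."""
    dt = datetime.fromisoformat(ts).replace(tzinfo=ZoneInfo(tz))
    # int() drops any sub-second precision, mirroring the truncation
    # described in the function summary above.
    return int(dt.timestamp())

print(unix_seconds_py("2015-07-22 10:00:00"))  # 1437584400
print(unix_seconds_py("2022-10-09 11:12:13"))  # 1665339133
```

Note that Spark uses the session time zone (`spark.sql.session.timeZone`) for the same role the explicit `tz` argument plays here.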