make_timestamp_ltz

Creates a timestamp with local time zone from the years, months, days, hours, mins, secs, and timezone fields. If the configuration spark.sql.ansi.enabled is false, the function returns NULL on invalid inputs. Otherwise, it throws an error instead (see the last example on this page).

Syntax

from pyspark.databricks.sql import functions as dbf

dbf.make_timestamp_ltz(years=<years>, months=<months>, days=<days>, hours=<hours>, mins=<mins>, secs=<secs>, timezone=<timezone>)

Parameters

years (pyspark.sql.Column or str): The year to represent, from 1 to 9999.
months (pyspark.sql.Column or str): The month-of-year to represent, from 1 (January) to 12 (December).
days (pyspark.sql.Column or str): The day-of-month to represent, from 1 to 31.
hours (pyspark.sql.Column or str): The hour-of-day to represent, from 0 to 23.
mins (pyspark.sql.Column or str): The minute-of-hour to represent, from 0 to 59.
secs (pyspark.sql.Column or str): The second-of-minute and its micro-fraction to represent, from 0 to 60. The value can be an integer such as 13 or a fraction such as 13.123. If secs equals 60, the seconds field is set to 0 and one minute is added to the final timestamp (see the example after this list).
timezone (pyspark.sql.Column or str, optional): The time zone identifier, for example CET or UTC.
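
To illustrate the secs rollover described above, here is a minimal sketch, assuming the same dbf import and spark session used elsewhere on this page:

from pyspark.databricks.sql import functions as dbf

# secs=60 rolls over: the seconds field becomes 0 and one minute is added.
df = spark.createDataFrame([[2014, 12, 28, 6, 30, 60]],
    ['year', 'month', 'day', 'hour', 'min', 'sec'])
df.select(
    dbf.make_timestamp_ltz(df.year, df.month, df.day, df.hour, df.min, df.sec)
).show(truncate=False)  # expected to show 2014-12-28 06:31:00 in the session time zone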

Returns

pyspark.sql.Column: A new column containing the constructed timestamp with local time zone.
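
The returned column should carry Spark's TIMESTAMP type. A quick way to confirm, again assuming the same dbf import and spark session:

from pyspark.databricks.sql import functions as dbf

# Inspect the data type of the returned column.
df = spark.createDataFrame([[2014, 12, 28, 6, 30, 45.887]],
    ['year', 'month', 'day', 'hour', 'min', 'sec'])
result = df.select(
    dbf.make_timestamp_ltz(df.year, df.month, df.day, df.hour, df.min, df.sec).alias('ts')
)
result.printSchema()  # the ts column should be reported as timestamp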

Examples

spark.conf.set("spark.sql.session.timeZone", "America/Los_Angeles")
from pyspark.databricks.sql import functions as dbf

# Example 1: build the timestamp using the explicit time zone from the 'tz' column (CET),
# then display it in the session time zone (America/Los_Angeles).
df = spark.createDataFrame([[2014, 12, 28, 6, 30, 45.887, 'CET']],
    ['year', 'month', 'day', 'hour', 'min', 'sec', 'tz'])
df.select(
    dbf.make_timestamp_ltz(df.year, df.month, 'day', df.hour, df.min, df.sec, 'tz')
).show(truncate=False)

# Example 2: omit the timezone argument, so the field values are interpreted
# in the session time zone.
df = spark.createDataFrame([[2014, 12, 28, 6, 30, 45.887, 'CET']],
    ['year', 'month', 'day', 'hour', 'min', 'sec', 'tz'])
df.select(
    dbf.make_timestamp_ltz(df.year, df.month, 'day', df.hour, df.min, df.sec)
).show(truncate=False)

spark.conf.unset("spark.sql.session.timeZone")
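
As noted in the description, behavior on invalid inputs depends on spark.sql.ansi.enabled. A minimal sketch of the non-ANSI case, assuming the same session and dbf import as above:

from pyspark.databricks.sql import functions as dbf

# With ANSI mode disabled, an out-of-range field (month = 13 here) yields NULL
# instead of raising an error.
spark.conf.set("spark.sql.ansi.enabled", "false")
df = spark.createDataFrame([[2014, 13, 28, 6, 30, 45.887]],
    ['year', 'month', 'day', 'hour', 'min', 'sec'])
df.select(
    dbf.make_timestamp_ltz(df.year, df.month, df.day, df.hour, df.min, df.sec)
).show(truncate=False)  # expected to show NULL
spark.conf.unset("spark.sql.ansi.enabled")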