Create a local date-time from years, months, days, hours, mins, and secs fields. Alternatively, create a local date-time from date and time fields. If the configuration spark.sql.ansi.enabled is false, the function returns NULL on invalid inputs; otherwise, it throws an error.
Syntax
import pyspark.sql.functions as sf
# From individual components
sf.make_timestamp_ntz(years=<years>, months=<months>, days=<days>, hours=<hours>, mins=<mins>, secs=<secs>)
# From date and time
sf.make_timestamp_ntz(date=<date>, time=<time>)
Parameters
| Parameter | Type | Description |
|---|---|---|
| years | pyspark.sql.Column or str | The year to represent, from 1 to 9999. Required when creating timestamps from individual components. Must be used with months, days, hours, mins, and secs. |
| months | pyspark.sql.Column or str | The month-of-year to represent, from 1 (January) to 12 (December). Required when creating timestamps from individual components. Must be used with years, days, hours, mins, and secs. |
| days | pyspark.sql.Column or str | The day-of-month to represent, from 1 to 31. Required when creating timestamps from individual components. Must be used with years, months, hours, mins, and secs. |
| hours | pyspark.sql.Column or str | The hour-of-day to represent, from 0 to 23. Required when creating timestamps from individual components. Must be used with years, months, days, mins, and secs. |
| mins | pyspark.sql.Column or str | The minute-of-hour to represent, from 0 to 59. Required when creating timestamps from individual components. Must be used with years, months, days, hours, and secs. |
| secs | pyspark.sql.Column or str | The second-of-minute and its micro-fraction to represent, from 0 to 60. The value can be either an integer like 13, or a fraction like 13.123. If secs equals 60, the seconds field is set to 0 and 1 minute is added to the final timestamp. Required when creating timestamps from individual components. Must be used with years, months, days, hours, and mins. |
| date | pyspark.sql.Column or str | The date to represent, in valid DATE format. Required when creating timestamps from date and time components. Must be used only with the time parameter. |
| time | pyspark.sql.Column or str | The time to represent, in valid TIME format. Required when creating timestamps from date and time components. Must be used only with the date parameter. |
Returns
pyspark.sql.Column: A new column that contains a local date-time.
Examples
Example 1: Make local date-time from years, months, days, hours, mins, secs.
import pyspark.sql.functions as sf
df = spark.createDataFrame(
    [[2014, 12, 28, 6, 30, 45.887]],
    ['year', 'month', 'day', 'hour', 'min', 'sec'])
df.select(
sf.make_timestamp_ntz('year', 'month', df.day, df.hour, df.min, df.sec)
).show(truncate=False)
+----------------------------------------------------+
|make_timestamp_ntz(year, month, day, hour, min, sec)|
+----------------------------------------------------+
|2014-12-28 06:30:45.887 |
+----------------------------------------------------+
Example 2: Make local date-time from date and time.
import pyspark.sql.functions as sf
from datetime import date, time
df = spark.range(1).select(
sf.lit(date(2014, 12, 28)).alias("date"),
sf.lit(time(6, 30, 45, 887000)).alias("time")
)
df.select(sf.make_timestamp_ntz(date=df.date, time=df.time)).show(truncate=False)
+------------------------------+
|make_timestamp_ntz(date, time)|
+------------------------------+
|2014-12-28 06:30:45.887 |
+------------------------------+
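As noted in the secs parameter description, a value of 60 rolls over into the next minute. That rollover rule can be sketched in plain Python with the standard-library datetime module; this is an illustration of the behavior, not PySpark's implementation, and make_local_datetime is a hypothetical helper name:

```python
from datetime import datetime, timedelta

def make_local_datetime(year, month, day, hour, minute, secs):
    # Split secs into whole seconds and a microsecond fraction,
    # then let timedelta carry any overflow (e.g. secs=60) into
    # the minute field, matching the documented rollover rule.
    whole = int(secs)
    micros = round((secs - whole) * 1_000_000)
    base = datetime(year, month, day, hour, minute)
    return base + timedelta(seconds=whole, microseconds=micros)

print(make_local_datetime(2014, 12, 28, 6, 30, 45.887))  # 2014-12-28 06:30:45.887000
print(make_local_datetime(2014, 12, 28, 6, 30, 60))      # 2014-12-28 06:31:00
```

The second call shows secs=60 producing a seconds field of 0 with one minute added, as described in the parameter table.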