make_dt_interval

Makes a DayTimeIntervalType duration from days, hours, mins, and secs.

For the corresponding Databricks SQL function, see make_dt_interval function.

Syntax

from pyspark.databricks.sql import functions as dbf

dbf.make_dt_interval(days=<days>, hours=<hours>, mins=<mins>, secs=<secs>)

Parameters

days (pyspark.sql.Column or str, optional): The number of days, positive or negative.
hours (pyspark.sql.Column or str, optional): The number of hours, positive or negative.
mins (pyspark.sql.Column or str, optional): The number of minutes, positive or negative.
secs (pyspark.sql.Column or str, optional): The number of seconds with the fractional part in microsecond precision.

Returns

pyspark.sql.Column: A new column that contains a DayTimeIntervalType duration.
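When collected to the driver, PySpark maps a DayTimeIntervalType value to Python's datetime.timedelta, and the components combine the obvious way (days, plus hours, plus minutes, plus seconds at microsecond precision). A minimal plain-Python sketch of that semantics (the helper name here is illustrative, not part of the API):

```python
from datetime import timedelta

def make_dt_interval(days=0, hours=0, mins=0, secs=0.0):
    # Combine the components into one duration; timedelta keeps
    # microsecond precision for the fractional seconds, matching
    # DayTimeIntervalType's precision.
    return timedelta(days=days, hours=hours, minutes=mins, seconds=secs)

print(make_dt_interval(1, 12, 30, 1.001001))  # 1 day, 12:30:01.001001
```

Negative components are allowed, so a negative overall duration simply falls out of the same arithmetic.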

Examples

from pyspark.databricks.sql import functions as dbf
df = spark.createDataFrame([[1, 12, 30, 1.001001]], ['day', 'hour', 'min', 'sec'])

# All four components passed as columns.
df.select('*', dbf.make_dt_interval(df.day, df.hour, df.min, df.sec)).show(truncate=False)

# A str argument is interpreted as a column name; omitted components default to 0.
df.select('*', dbf.make_dt_interval(df.day, 'hour', df.min)).show(truncate=False)

# Days and hours only.
df.select('*', dbf.make_dt_interval(df.day, df.hour)).show(truncate=False)

# Days only, referenced by column name.
df.select('*', dbf.make_dt_interval('day')).show(truncate=False)

# No arguments: a zero-length interval.
spark.range(1).select(dbf.make_dt_interval()).show(truncate=False)