Parses the `col` column into a timestamp using the optional `format`. The function always returns NULL on invalid input, whether or not ANSI SQL mode is enabled. The result data type is consistent with the value of the spark.sql.timestampType configuration.
For the corresponding Databricks SQL function, see try_to_timestamp function.
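The following is a minimal sketch, assuming an active Spark session named `spark`, of how the result type tracks spark.sql.timestampType; with the configuration set to TIMESTAMP_NTZ, the parsed column is reported as a TIMESTAMP_NTZ value.
from pyspark.databricks.sql import functions as dbf

# Sketch: the result type of try_to_timestamp follows spark.sql.timestampType.
origin = spark.conf.get("spark.sql.timestampType")
spark.conf.set("spark.sql.timestampType", "TIMESTAMP_NTZ")
try:
    df = spark.createDataFrame([('1997-02-28 10:30:00',)], ['t'])
    # The parsed column appears as timestamp_ntz because of the configuration above.
    df.select(dbf.try_to_timestamp(df.t)).printSchema()
finally:
    spark.conf.set("spark.sql.timestampType", origin)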
Syntax
from pyspark.databricks.sql import functions as dbf
dbf.try_to_timestamp(col=<col>, format=<format>)
Parameters
| Parameter | Type | Description |
|---|---|---|
| `col` | `pyspark.sql.Column` or str | Column values to convert. |
| `format` | literal string, optional | Format to use to convert timestamp values. |
Examples
from pyspark.databricks.sql import functions as dbf

# Parse using the default timestamp format.
df = spark.createDataFrame([('1997-02-28 10:30:00',)], ['t'])
df.select(dbf.try_to_timestamp(df.t)).show()

# Parse using an explicit format pattern.
df = spark.createDataFrame([('1997-02-28 10:30:00',)], ['t'])
df.select(dbf.try_to_timestamp(df.t, dbf.lit('yyyy-MM-dd HH:mm:ss'))).show()
# Even with ANSI SQL mode enabled, invalid input returns NULL rather than raising an error.
origin = spark.conf.get("spark.sql.ansi.enabled")
spark.conf.set("spark.sql.ansi.enabled", "true")
try:
    df = spark.createDataFrame([('malformed',)], ['t'])
    df.select(dbf.try_to_timestamp(df.t)).show()
finally:
    spark.conf.set("spark.sql.ansi.enabled", origin)