Returns the day of the week for date/timestamp (0 = Monday, 1 = Tuesday, ..., 6 = Sunday).
For the corresponding Databricks SQL function, see weekday function.
Syntax
from pyspark.databricks.sql import functions as dbf
dbf.weekday(col=<col>)
Parameters
| Parameter | Type | Description |
|---|---|---|
| col | pyspark.sql.Column or str | target date/timestamp column to work on. |
Returns
pyspark.sql.Column: the day of the week for date/timestamp (0 = Monday, 1 = Tuesday, ..., 6 = Sunday).
Examples
from pyspark.databricks.sql import functions as dbf
df = spark.createDataFrame([('2015-04-08',), ('2024-10-31',)], ['dt'])
df.select("*", dbf.typeof('dt'), dbf.weekday('dt')).show()
df = spark.createDataFrame([('2015-04-08 13:08:15',), ('2024-10-31 10:09:16',)], ['ts'])
df.select("*", dbf.typeof('ts'), dbf.weekday('ts')).show()
import datetime
df = spark.createDataFrame([
(datetime.date(2015, 4, 8),),
(datetime.date(2024, 10, 31),)], ['dt'])
df.select("*", dbf.typeof('dt'), dbf.weekday('dt')).show()
import datetime
df = spark.createDataFrame([
(datetime.datetime(2015, 4, 8, 13, 8, 15),),
(datetime.datetime(2024, 10, 31, 10, 9, 16),)], ['ts'])
df.select("*", dbf.typeof('ts'), dbf.weekday('ts')).show()