Extracts the week number of a given date as an integer. A week is considered to start on a Monday, and week 1 is the first week with more than 3 days, as defined by ISO 8601.
For the corresponding Databricks SQL function, see weekofyear function.
Syntax
from pyspark.databricks.sql import functions as dbf
dbf.weekofyear(col=<col>)
Parameters
| Parameter | Type | Description |
|---|---|---|
| col | pyspark.sql.Column or str | Target date or timestamp column to work on. |
Returns
pyspark.sql.Column: the week of the year for the given date, as an integer.
Examples
from pyspark.databricks.sql import functions as dbf
import datetime

# Example 1: date strings.
df = spark.createDataFrame([('2015-04-08',), ('2024-10-31',)], ['dt'])
df.select("*", dbf.typeof('dt'), dbf.weekofyear('dt')).show()

# Example 2: timestamp strings.
df = spark.createDataFrame([('2015-04-08 13:08:15',), ('2024-10-31 10:09:16',)], ['ts'])
df.select("*", dbf.typeof('ts'), dbf.weekofyear('ts')).show()

# Example 3: datetime.date objects.
df = spark.createDataFrame([
    (datetime.date(2015, 4, 8),),
    (datetime.date(2024, 10, 31),)], ['dt'])
df.select("*", dbf.typeof('dt'), dbf.weekofyear('dt')).show()

# Example 4: datetime.datetime objects.
df = spark.createDataFrame([
    (datetime.datetime(2015, 4, 8, 13, 8, 15),),
    (datetime.datetime(2024, 10, 31, 10, 9, 16),)], ['ts'])
df.select("*", dbf.typeof('ts'), dbf.weekofyear('ts')).show()