Calculates the cyclic redundancy check value (CRC32) of a binary column and returns the value as a bigint. Supports Spark Connect.
For the corresponding Databricks SQL function, see the crc32 function.
Syntax
```python
from pyspark.databricks.sql import functions as dbf
dbf.crc32(col=<col>)
```
Parameters
| Parameter | Type | Description |
|---|---|---|
| `col` | `pyspark.sql.Column` or `str` | Target column to compute on. |
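Because `col` accepts either a column name or a `Column` object, the two forms below are equivalent. This is a minimal sketch that assumes a DataFrame `df` with a string column `a`:

```python
from pyspark.databricks.sql import functions as dbf

df = spark.createDataFrame([('ABC',)], ['a'])

# `col` accepts the column name as a string...
df.select(dbf.crc32('a'))
# ...or a pyspark.sql.Column object; both refer to the same column.
df.select(dbf.crc32(df['a']))
```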
Returns
`pyspark.sql.Column`: a column containing the computed CRC32 value of each row, as a bigint.
Examples
```python
from pyspark.databricks.sql import functions as dbf

df = spark.createDataFrame([('ABC',)], ['a'])
df.select('*', dbf.crc32('a')).show(truncate=False)
```

```
+---+----------+
|a  |crc32(a)  |
+---+----------+
|ABC|2743272264|
+---+----------+
```
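The computed column keeps the generated name `crc32(a)`; to rename it, alias the result with `Column.alias`. As a sanity check, Spark's CRC32 uses the standard CRC-32 polynomial, so the value above can be reproduced locally with Python's `zlib` (a minimal sketch reusing the same DataFrame):

```python
import zlib

from pyspark.databricks.sql import functions as dbf

df = spark.createDataFrame([('ABC',)], ['a'])

# Alias the computed column to replace the generated `crc32(a)` name.
df.select(dbf.crc32('a').alias('checksum')).show()

# The same standard CRC-32 polynomial is used by zlib, so the
# checksum of the raw bytes matches Spark's result.
print(zlib.crc32(b'ABC'))  # 2743272264
```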