Returns the decompressed value of `input` using Zstandard. Data compressed in both single-pass mode and streaming mode is supported. If decompression fails, an exception is thrown.
Syntax
from pyspark.databricks.sql import functions as dbf
dbf.zstd_decompress(input=<input>)
Parameters
| Parameter | Type | Description |
|---|---|---|
| input | pyspark.sql.Column or str | The binary value to decompress. |
Returns
pyspark.sql.Column: A new column that contains the uncompressed value.
Examples
from pyspark.databricks.sql import functions as dbf

# The input column holds a base64-encoded, Zstandard-compressed payload.
df = spark.createDataFrame([("KLUv/SCCpQAAaEFwYWNoZSBTcGFyayABABLS+QU=",)], ["input"])

# Decode the base64 text to binary, decompress it, and cast the result to a string.
df.select(dbf.zstd_decompress(dbf.unbase64(df.input)).cast("string").alias("result")).show(truncate=False)
+----------------------------------------------------------------------------------------------------------------------------------+
|result |
+----------------------------------------------------------------------------------------------------------------------------------+
|Apache Spark Apache Spark Apache Spark Apache Spark Apache Spark Apache Spark Apache Spark Apache Spark Apache Spark Apache Spark |
+----------------------------------------------------------------------------------------------------------------------------------+
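A round trip entirely within Spark can also be sketched. This is a minimal, non-authoritative example that assumes a companion zstd_compress function is available in the same pyspark.databricks.sql module and that spark is an active SparkSession; if no such function is available, substitute any column that already holds Zstandard-compressed binary data.

from pyspark.databricks.sql import functions as dbf
from pyspark.sql import functions as F

# Assumed companion function: dbf.zstd_compress (not documented on this page).
df = spark.createDataFrame([("Apache Spark",)], ["text"])
df.select(
    dbf.zstd_decompress(dbf.zstd_compress(F.col("text").cast("binary")))
        .cast("string")
        .alias("result")
).show(truncate=False)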
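The exception-on-failure behavior described above can be observed by passing binary data that is not valid Zstandard. This is a hedged sketch, again assuming spark is an active SparkSession; the exact exception type surfaced by Spark may vary by runtime version.

from pyspark.databricks.sql import functions as dbf
from pyspark.sql import functions as F

# Binary input that is not valid Zstandard-compressed data.
bad = spark.createDataFrame([(bytearray(b"not zstd data"),)], ["input"])

try:
    # The failure surfaces when the action runs, not when the plan is built.
    bad.select(dbf.zstd_decompress(F.col("input")).alias("result")).collect()
except Exception as e:
    print(type(e).__name__)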