transform_keys

Applies a function to every key-value pair in a map and returns a map in which the results of those applications become the new keys; the values are left unchanged. Supports Spark Connect.

For the corresponding Databricks SQL function, see transform_keys function.
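The same transformation can also be written with the SQL lambda syntax of that function and evaluated from Python through expr. The sketch below is illustrative only; it uses the standard pyspark.sql.functions module (which exposes the same transform_keys) and assumes an active SparkSession named spark, as in the example further down.

from pyspark.sql import functions as F

df = spark.createDataFrame([(1, {"foo": -2.0, "bar": 2.0})], ("id", "data"))

# Same key transformation, expressed with the SQL lambda syntax of the
# corresponding SQL function and evaluated via expr().
df.select(
    F.expr("transform_keys(data, (k, v) -> upper(k))").alias("data_upper")
).show(truncate=False)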

Syntax

from pyspark.databricks.sql import functions as dbf

dbf.transform_keys(col=<col>, f=<f>)

Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| col | pyspark.sql.Column or str | Name of column or expression. |
| f | function | A binary function (k: Column, v: Column) -> Column that returns the new key for each entry. |
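To make the binary-function contract concrete: f is called once per map entry with the key and value as Column arguments and must return a Column holding the new key. A minimal sketch, again using the standard pyspark.sql.functions module and assuming an active spark session; the alias data_tagged is illustrative.

from pyspark.sql import functions as F

df = spark.createDataFrame([(1, {"foo": -2.0, "bar": 2.0})], ("id", "data"))

# f receives each (key, value) pair as Columns and returns the new key.
# Here the value is folded into the key, e.g. "foo" becomes "foo_-2.0".
df.select(
    F.transform_keys(
        "data", lambda k, v: F.concat(k, F.lit("_"), v.cast("string"))
    ).alias("data_tagged")
).show(truncate=False)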

Returns

pyspark.sql.Column: a new map of entries in which each key is the result of applying the given function to the corresponding key-value pair; the values are unchanged.

Examples

from pyspark.databricks.sql import functions as dbf

df = spark.createDataFrame([(1, {"foo": -2.0, "bar": 2.0})], ("id", "data"))
row = df.select(
    dbf.transform_keys("data", lambda k, _: dbf.upper(k)).alias("data_upper")
).head()

sorted(row["data_upper"].items())
# [('BAR', 2.0), ('FOO', -2.0)]
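The returned value is an ordinary map Column, so it composes with other map functions. A short follow-up sketch under the same assumptions (active spark session; standard pyspark.sql.functions module, which exposes the same transform_keys and upper used above):

from pyspark.sql import functions as F

df = spark.createDataFrame([(1, {"foo": -2.0, "bar": 2.0})], ("id", "data"))

# Extract just the transformed keys from the resulting map column.
df.select(
    F.map_keys(
        F.transform_keys("data", lambda k, _: F.upper(k))
    ).alias("upper_keys")
).show(truncate=False)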