Returns true if the map contains the key.
Syntax
from pyspark.sql import functions as sf
sf.map_contains_key(col, value)
Parameters
| Parameter | Type | Description |
|---|---|---|
| col | pyspark.sql.Column or str | The name of the column or an expression that represents the map. |
| value | Any | A literal value, or a Column expression. |
Returns
pyspark.sql.Column: True if the key is in the map, False otherwise.
Examples
Example 1: The key is in the map
from pyspark.sql import functions as sf
df = spark.sql("SELECT map(1, 'a', 2, 'b') as data")
df.select(sf.map_contains_key("data", 1)).show()
+-------------------------+
|map_contains_key(data, 1)|
+-------------------------+
| true|
+-------------------------+
Example 2: The key is not in the map
from pyspark.sql import functions as sf
df = spark.sql("SELECT map(1, 'a', 2, 'b') as data")
df.select(sf.map_contains_key("data", -1)).show()
+--------------------------+
|map_contains_key(data, -1)|
+--------------------------+
| false|
+--------------------------+
Example 3: Check for key using a column
from pyspark.sql import functions as sf
df = spark.sql("SELECT map(1, 'a', 2, 'b') as data, 1 as key")
df.select(sf.map_contains_key("data", sf.col("key"))).show()
+---------------------------+
|map_contains_key(data, key)|
+---------------------------+
| true|
+---------------------------+
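Because the result is a boolean Column, it can also be used directly as a filter predicate. The following is a usage sketch (not part of the original reference), assuming the same data DataFrame as in the examples above:
from pyspark.sql import functions as sf
df = spark.sql("SELECT map(1, 'a', 2, 'b') as data")
# Keep only rows whose map contains the key 2; the single row here qualifies.
df.where(sf.map_contains_key("data", 2)).show()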