Returns an unordered array of all entries in the given map.
Syntax
from pyspark.sql import functions as sf
sf.map_entries(col)
Parameters
| Parameter | Type | Description |
|---|---|---|
| col | pyspark.sql.Column or str | Name of the column, or a column expression, containing the map |
Returns
pyspark.sql.Column: An array of key-value pairs, each pair represented as a struct
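Each element of the returned array is a struct with two fields, key and value. As a minimal sketch (assuming an active SparkSession bound to the name spark, as in the examples below), the resulting schema can be inspected like this:
from pyspark.sql import functions as sf
# Inspect the result type: an array of structs, each with a `key` and a `value` field.
# Assumes an active SparkSession named `spark`.
df = spark.sql("SELECT map(1, 'a', 2, 'b') AS data")
df.select(sf.map_entries("data").alias("entries")).printSchema()
# Printed schema (nullability annotations omitted):
# root
#  |-- entries: array
#  |    |-- element: struct
#  |    |    |-- key: integer
#  |    |    |-- value: string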
Examples
Example 1: Extracting entries from a simple map
Because the order of the returned entries is not guaranteed, the first two examples wrap the result in sort_array to make the output deterministic.
from pyspark.sql import functions as sf
df = spark.sql("SELECT map(1, 'a', 2, 'b') as data")
df.select(sf.sort_array(sf.map_entries("data"))).show()
+-----------------------------------+
|sort_array(map_entries(data), true)|
+-----------------------------------+
| [{1, a}, {2, b}]|
+-----------------------------------+
Example 2: Extracting entries from a map with complex keys and values
from pyspark.sql import functions as sf
df = spark.sql("SELECT map(array(1, 2), array('a', 'b'), "
               "array(3, 4), array('c', 'd')) as data")
df.select(sf.sort_array(sf.map_entries("data"))).show(truncate=False)
+------------------------------------+
|sort_array(map_entries(data), true) |
+------------------------------------+
|[{[1, 2], [a, b]}, {[3, 4], [c, d]}]|
+------------------------------------+
Example 3: Extracting entries from an empty map
from pyspark.sql import functions as sf
df = spark.sql("SELECT map() as data")
df.select(sf.map_entries("data")).show()
+-----------------+
|map_entries(data)|
+-----------------+
| []|
+-----------------+
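The struct fields of each entry can be addressed by name once the array is exploded into rows. A short usage sketch (again assuming an active SparkSession named spark; the column alias is illustrative):
from pyspark.sql import functions as sf
# Explode the entries into one row per key-value pair, then pull out the struct fields.
# Assumes an active SparkSession named `spark`; entry order is not guaranteed.
df = spark.sql("SELECT map(1, 'a', 2, 'b') AS data")
entries = df.select(sf.explode(sf.map_entries("data")).alias("entry"))
entries.select("entry.key", "entry.value").show()
# Possible output (row order may vary):
# +---+-----+
# |key|value|
# +---+-----+
# |  1|    a|
# |  2|    b|
# +---+-----+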