map_concat

Returns the union of all the given maps. For duplicate keys in the input maps, the handling is governed by spark.sql.mapKeyDedupPolicy. By default, it throws an exception; if set to LAST_WIN, the value from the last map wins (see Example 5 below).

Syntax

from pyspark.sql import functions as sf

sf.map_concat(*cols)

Parameters

Parameter | Type | Description
cols | pyspark.sql.Column or str | Column names or Columns
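
Because cols accepts either column names or Column objects, the two call styles below are interchangeable. This is a minimal illustration, assuming an active spark session as in the examples further down:

from pyspark.sql import functions as sf
df = spark.sql("SELECT map(1, 'a') as map1, map(2, 'b') as map2")
# Passing column names as strings ...
df.select(sf.map_concat("map1", "map2")).show(truncate=False)
# ... is equivalent to passing Column objects
df.select(sf.map_concat(df.map1, sf.col("map2"))).show(truncate=False)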

Returns

pyspark.sql.Column: A map of merged entries from the other maps.

Examples

Example 1: Basic usage of map_concat

from pyspark.sql import functions as sf
df = spark.sql("SELECT map(1, 'a', 2, 'b') as map1, map(3, 'c') as map2")
df.select(sf.map_concat("map1", "map2")).show(truncate=False)
+------------------------+
|map_concat(map1, map2)  |
+------------------------+
|{1 -> a, 2 -> b, 3 -> c}|
+------------------------+

Example 2: map_concat with three maps

from pyspark.sql import functions as sf
df = spark.sql("SELECT map(1, 'a') as map1, map(2, 'b') as map2, map(3, 'c') as map3")
df.select(sf.map_concat("map1", "map2", "map3")).show(truncate=False)
+----------------------------+
|map_concat(map1, map2, map3)|
+----------------------------+
|{1 -> a, 2 -> b, 3 -> c}    |
+----------------------------+

Example 3: map_concat with an empty map

from pyspark.sql import functions as sf
df = spark.sql("SELECT map(1, 'a', 2, 'b') as map1, map() as map2")
df.select(sf.map_concat("map1", "map2")).show(truncate=False)
+----------------------+
|map_concat(map1, map2)|
+----------------------+
|{1 -> a, 2 -> b}      |
+----------------------+

Example 4: map_concat with null values

from pyspark.sql import functions as sf
df = spark.sql("SELECT map(1, 'a', 2, 'b') as map1, map(3, null) as map2")
df.select(sf.map_concat("map1", "map2")).show(truncate=False)
+---------------------------+
|map_concat(map1, map2)     |
+---------------------------+
|{1 -> a, 2 -> b, 3 -> NULL}|
+---------------------------+
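
Example 5: map_concat with duplicate keys under LAST_WIN

A minimal sketch of the duplicate-key handling described at the top of this page: with spark.sql.mapKeyDedupPolicy set to LAST_WIN, the duplicate key 2 takes its value from the last map. The output shown is the expected result under this policy; with the default policy (EXCEPTION), the same query would raise an error instead.

from pyspark.sql import functions as sf
# Default policy (EXCEPTION) would raise on the duplicate key 2;
# LAST_WIN keeps the value from the last map instead.
spark.conf.set("spark.sql.mapKeyDedupPolicy", "LAST_WIN")
df = spark.sql("SELECT map(1, 'a', 2, 'b') as map1, map(2, 'c') as map2")
df.select(sf.map_concat("map1", "map2")).show(truncate=False)
+----------------------+
|map_concat(map1, map2)|
+----------------------+
|{1 -> a, 2 -> c}      |
+----------------------+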