Returns a DataFrame containing a new row for each element in the given array or map. Unlike explode, if the array/map is null or empty, a row with null is produced. Uses the default column name col for elements in the array and key and value for elements in the map unless specified otherwise.
Syntax
spark.tvf.explode_outer(collection)
Parameters
| Parameter | Type | Description |
|---|---|---|
| collection | pyspark.sql.Column | Target column to work on. |
Returns
pyspark.sql.DataFrame: A DataFrame with a new row for each element; if the collection is null or empty, a single row containing null is produced.
Examples
import pyspark.sql.functions as sf
spark.tvf.explode_outer(sf.array(sf.lit("foo"), sf.lit("bar"))).show()
+---+
|col|
+---+
|foo|
|bar|
+---+
import pyspark.sql.functions as sf
spark.tvf.explode_outer(sf.array()).show()
+----+
| col|
+----+
|NULL|
+----+
import pyspark.sql.functions as sf
spark.tvf.explode_outer(sf.create_map(sf.lit("x"), sf.lit(1.0))).show()
+---+-----+
|key|value|
+---+-----+
| x| 1.0|
+---+-----+
import pyspark.sql.functions as sf
spark.tvf.explode_outer(sf.create_map()).show()
+----+-----+
| key|value|
+----+-----+
|NULL| NULL|
+----+-----+