Explodes an array of structs into a table. Unlike inline, if the array is null or empty, null is produced for each nested column rather than the row being dropped.
Syntax
```python
from pyspark.sql import functions as sf

sf.inline_outer(col)
```
Parameters
| Parameter | Type | Description |
|---|---|---|
| col | pyspark.sql.Column or column name | Input column of values to explode. |
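The column may be passed either by name or as a Column object. A minimal sketch, assuming an active SparkSession named spark and a hypothetical DataFrame built only for illustration:

```python
from pyspark.sql import Row
from pyspark.sql import functions as sf

# Hypothetical one-row DataFrame with an array-of-structs column "s".
df = spark.createDataFrame([Row(s=[Row(a=1, b=2)])])

# Both forms are equivalent: a column name string or a Column object.
df.select(sf.inline_outer("s")).show()
df.select(sf.inline_outer(sf.col("s"))).show()
```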
Returns
pyspark.sql.Column: generator expression with the inline exploded result.
Examples
```python
>>> from pyspark.sql import functions as sf
>>> df = spark.sql('SELECT * FROM VALUES (1,ARRAY(NAMED_STRUCT("a",1,"b",2), NULL, NAMED_STRUCT("a",3,"b",4))), (2,ARRAY()), (3,NULL) AS t(i,s)')
>>> df.printSchema()
root
 |-- i: integer (nullable = false)
 |-- s: array (nullable = true)
 |    |-- element: struct (containsNull = true)
 |    |    |-- a: integer (nullable = false)
 |    |    |-- b: integer (nullable = false)
>>> df.select('*', sf.inline_outer('s')).show(truncate=False)
+---+----------------------+----+----+
|i  |s                     |a   |b   |
+---+----------------------+----+----+
|1  |[{1, 2}, NULL, {3, 4}]|1   |2   |
|1  |[{1, 2}, NULL, {3, 4}]|NULL|NULL|
|1  |[{1, 2}, NULL, {3, 4}]|3   |4   |
|2  |[]                    |NULL|NULL|
|3  |NULL                  |NULL|NULL|
+---+----------------------+----+----+
```
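For comparison, a minimal sketch of the same query with inline instead of inline_outer, based on the documented difference above: because inline produces no rows for null or empty arrays, only the three rows generated from i=1 are expected in its output.

```python
from pyspark.sql import functions as sf

# Same input as the example above.
df = spark.sql('SELECT * FROM VALUES (1,ARRAY(NAMED_STRUCT("a",1,"b",2), NULL, NAMED_STRUCT("a",3,"b",4))), (2,ARRAY()), (3,NULL) AS t(i,s)')

# inline drops the rows whose array is empty or NULL, so i=2 and i=3
# do not appear here; inline_outer (shown above) keeps them with NULL
# in columns a and b.
df.select('*', sf.inline('s')).show(truncate=False)
```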