Removes null values from the array.
Syntax
from pyspark.sql import functions as sf
sf.array_compact(col)
Parameters
| Parameter | Type | Description |
|---|---|---|
| col | pyspark.sql.Column or str | Name of column or expression |
Returns
pyspark.sql.Column: a new Column that is an array excluding the null values from the input column.
Examples
Example 1: Removing null values from a simple array
from pyspark.sql import functions as sf
df = spark.createDataFrame([([1, None, 2, 3],)], ['data'])
df.select(sf.array_compact(df.data)).show()
+-------------------+
|array_compact(data)|
+-------------------+
| [1, 2, 3]|
+-------------------+
Example 2: Removing null values from multiple arrays
from pyspark.sql import functions as sf
df = spark.createDataFrame([([1, None, 2, 3],), ([4, 5, None, 4],)], ['data'])
df.select(sf.array_compact(df.data)).show()
+-------------------+
|array_compact(data)|
+-------------------+
| [1, 2, 3]|
| [4, 5, 4]|
+-------------------+
Example 3: Removing null values from an array containing only null values
from pyspark.sql import functions as sf
from pyspark.sql.types import ArrayType, StringType, StructField, StructType
schema = StructType([StructField("data", ArrayType(StringType()), True)])
df = spark.createDataFrame([([None, None, None],)], schema)
df.select(sf.array_compact(df.data)).show()
+-------------------+
|array_compact(data)|
+-------------------+
| []|
+-------------------+
Example 4: Removing null values from an array with no null values
from pyspark.sql import functions as sf
df = spark.createDataFrame([([1, 2, 3],)], ['data'])
df.select(sf.array_compact(df.data)).show()
+-------------------+
|array_compact(data)|
+-------------------+
| [1, 2, 3]|
+-------------------+
Example 5: Removing null values from an empty array
from pyspark.sql import functions as sf
from pyspark.sql.types import ArrayType, StringType, StructField, StructType
schema = StructType([
StructField("data", ArrayType(StringType()), True)
])
df = spark.createDataFrame([([],)], schema)
df.select(sf.array_compact(df.data)).show()
+-------------------+
|array_compact(data)|
+-------------------+
| []|
+-------------------+
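Note: per the parameter table, col can also be given as a column name string rather than a Column object. A minimal sketch reusing the DataFrame from Example 1, under that assumption:
from pyspark.sql import functions as sf
df = spark.createDataFrame([([1, None, 2, 3],)], ['data'])
# Pass the column name as a string instead of df.data;
# the result should match Example 1: [1, 2, 3]
df.select(sf.array_compact("data")).show()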