Collection function: returns the element of an array at the given (1-based) index, or the value for the given key in a map. For arrays, an index of 0 causes Spark to throw an error, while a negative index counts back from the end of the array (-1 is the last element). The function returns NULL if the index exceeds the length of the array. For maps, the function returns NULL if the key is not contained in the map.
For the corresponding Databricks SQL function, see the try_element_at function.
Syntax
```python
from pyspark.databricks.sql import functions as dbf

dbf.try_element_at(col=<col>, extraction=<extraction>)
```
Parameters
| Parameter | Type | Description |
|---|---|---|
| `col` | `pyspark.sql.Column` or `str` | Name of the column containing the array or map. |
| `extraction` | `pyspark.sql.Column` or `str` | Index to look up in the array, or key to look up in the map. |
Examples
Example 1: Getting the first element of an array
```python
from pyspark.databricks.sql import functions as dbf

df = spark.createDataFrame([(["a", "b", "c"],)], ['data'])
df.select(dbf.try_element_at(df.data, dbf.lit(1))).show()
```

```
+-----------------------+
|try_element_at(data, 1)|
+-----------------------+
|                      a|
+-----------------------+
```
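The NULL behavior for an out-of-range index can be seen with the same DataFrame. This is a minimal sketch, assuming the same SparkSession as in Example 1:

```python
from pyspark.databricks.sql import functions as dbf

df = spark.createDataFrame([(["a", "b", "c"],)], ['data'])
# Index 5 exceeds the array length of 3, so try_element_at
# yields NULL rather than raising an error.
df.select(dbf.try_element_at(df.data, dbf.lit(5))).show()
```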
Example 2: Getting the last element of an array using negative index
```python
from pyspark.databricks.sql import functions as dbf

df = spark.createDataFrame([(["a", "b", "c"],)], ['data'])
df.select(dbf.try_element_at(df.data, dbf.lit(-1))).show()
```

```
+------------------------+
|try_element_at(data, -1)|
+------------------------+
|                       c|
+------------------------+
```
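For map columns, the same NULL semantics apply to missing keys. A minimal sketch, assuming the same environment as the examples above (the map values here are arbitrary):

```python
from pyspark.databricks.sql import functions as dbf

df = spark.createDataFrame([({"a": 1.0, "b": 2.0},)], ['data'])
# Existing key: returns the mapped value (1.0).
df.select(dbf.try_element_at(df.data, dbf.lit("a"))).show()
# Missing key: "z" is not contained in the map, so the result is NULL.
df.select(dbf.try_element_at(df.data, dbf.lit("z"))).show()
```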