Extracts a JSON object from a JSON string based on the specified JSON path, and returns the JSON string of the extracted object. Returns NULL if the input JSON string is invalid.
Syntax
from pyspark.sql import functions as sf
sf.get_json_object(col, path)
Parameters
| Parameter | Type | Description |
|---|---|---|
| col | pyspark.sql.Column or str | String column in JSON format. |
| path | str | Path to the JSON object to extract. |
Returns
pyspark.sql.Column: string representation of the given JSON object value.
Examples
Example 1: Extract a JSON object from a JSON string
from pyspark.sql import functions as sf
data = [("1", '''{"f1": "value1", "f2": "value2"}'''), ("2", '''{"f1": "value12"}''')]
df = spark.createDataFrame(data, ("key", "jstring"))
df.select(df.key,
    sf.get_json_object(df.jstring, '$.f1').alias("c0"),
    sf.get_json_object(df.jstring, '$.f2').alias("c1")
).show()
+---+-------+------+
|key| c0| c1|
+---+-------+------+
| 1| value1|value2|
| 2|value12| NULL|
+---+-------+------+
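For intuition only, the `$.field` lookup can be sketched in plain Python with the standard `json` module. `extract_field` below is a hypothetical analogue, not Spark's implementation; like `get_json_object`, it yields None (NULL) when the input is not valid JSON or the field is absent:

```python
import json

def extract_field(json_str, field):
    # Hypothetical analogue of get_json_object(col, '$.' + field).
    try:
        obj = json.loads(json_str)
    except (json.JSONDecodeError, TypeError):
        # Invalid JSON input -> NULL, matching get_json_object.
        return None
    if isinstance(obj, dict) and field in obj:
        value = obj[field]
        # get_json_object returns a string column; non-string scalars
        # are rendered in their JSON form here.
        return value if isinstance(value, str) else json.dumps(value)
    return None

print(extract_field('{"f1": "value1", "f2": "value2"}', "f1"))  # value1
print(extract_field('{"f1": "value12"}', "f2"))                 # None
print(extract_field('not json', "f1"))                          # None
```

This mirrors the table above: `f2` is missing for key 2, so the result is NULL rather than an error.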
Example 2: Extract a JSON object from a JSON array
from pyspark.sql import functions as sf
data = [
("1", '''[{"f1": "value1"},{"f1": "value2"}]'''),
("2", '''[{"f1": "value12"},{"f2": "value13"}]''')
]
df = spark.createDataFrame(data, ("key", "jarray"))
df.select(df.key,
    sf.get_json_object(df.jarray, '$[0].f1').alias("c0"),
    sf.get_json_object(df.jarray, '$[1].f2').alias("c1")
).show()
+---+-------+-------+
|key| c0| c1|
+---+-------+-------+
| 1| value1| NULL|
| 2|value12|value13|
+---+-------+-------+
df.select(df.key,
    sf.get_json_object(df.jarray, '$[*].f1').alias("c0"),
    sf.get_json_object(df.jarray, '$[*].f2').alias("c1")
).show()
+---+-------------------+---------+
|key| c0| c1|
+---+-------------------+---------+
| 1|["value1","value2"]| NULL|
| 2| "value12"|"value13"|
+---+-------------------+---------+
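Note the shape of the `$[*]` wildcard results above: multiple matches are collected into a JSON array (`["value1","value2"]`), while a single match is returned in its JSON form with quotes retained (`"value12"`). A hedged pure-Python sketch of that behavior, using a hypothetical `extract_wildcard_field` helper rather than Spark's actual code:

```python
import json

def extract_wildcard_field(json_str, field):
    # Hypothetical sketch of get_json_object(col, '$[*].' + field):
    # gather the field from every array element; a single match is
    # returned as its JSON form, multiple matches as a JSON array.
    try:
        arr = json.loads(json_str)
    except (json.JSONDecodeError, TypeError):
        return None
    if not isinstance(arr, list):
        return None
    matches = [e[field] for e in arr if isinstance(e, dict) and field in e]
    if not matches:
        return None
    if len(matches) == 1:
        return json.dumps(matches[0])
    return json.dumps(matches, separators=(",", ":"))

print(extract_wildcard_field('[{"f1": "value1"},{"f1": "value2"}]', "f1"))
# ["value1","value2"]
print(extract_wildcard_field('[{"f1": "value12"},{"f2": "value13"}]', "f1"))
# "value12"
```

Because of this, downstream code parsing a wildcard result must handle both a quoted scalar and a JSON array string.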