Hello Shambhu Rai
You can load a JSON file using `spark.read.format("json")`.
Here is an example from the official documentation page. Given a file `example.json` with one JSON record per line:

```json
{"string":"string1","int":1,"array":[1,2,3],"dict": {"key": "value1"}}
{"string":"string2","int":2,"array":[2,4,6],"dict": {"key": "value2"}}
{"string":"string3","int":3,"array":[3,6,9],"dict": {"key": "value3", "extra_key": "extra_value3"}}
```

```scala
val df = spark.read.format("json").load("example.json")
df.printSchema()
```
Reference document:
https://docs.databricks.com/en/query/formats/json.html
I hope this answers your question.
If this answers your question, please consider accepting the answer by clicking **Accept Answer** and up-voting, as it helps the community find answers to similar questions.