Unable to view parquet files

Abhishek Gaikwad 191 Reputation points

When I run the code below in Scala on Databricks, it runs successfully, and I am able to read the file back from that location.
However, when I run `display(dbutils.fs.ls("mnt/Datalake3/feature/"))`, I cannot see any of the parquet files.
I am also unable to locate the files in the storage account on the Azure portal, and cannot view them through Storage Explorer.
I am writing to Data Lake Gen2.

```scala
val someDF = Seq(
  (8, "bat"),
  (64, "mouse"),
  (-27, "horse")
).toDF("number", "word")
val data = sqlContext.read.parquet("/dbfs/mnt/Datalake3/feature/test1.parquet")
```
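A way to see where the files actually went: Spark APIs treat a plain path as a DBFS path, so `/dbfs/mnt/...` is interpreted as `dbfs:/dbfs/mnt/...`, an internal DBFS folder rather than the ADLS mount. This sketch (assuming a Databricks Scala notebook and the mount name from the question) lists both locations to compare:

```scala
// Minimal sketch, to be run in a Databricks notebook (Scala).
// Spark resolved the write path "/dbfs/mnt/..." as the DBFS path
// "dbfs:/dbfs/mnt/...", so the files likely landed in internal DBFS storage:
display(dbutils.fs.ls("dbfs:/dbfs/mnt/Datalake3/feature/"))

// The mounted ADLS Gen2 location, which is what the portal / Storage Explorer shows:
display(dbutils.fs.ls("/mnt/Datalake3/feature/"))
```

The `/dbfs/...` prefix is only meant for local-file (FUSE) access, e.g. `java.io` or `%sh`; Spark readers and writers should use `/mnt/...` or `dbfs:/mnt/...` directly.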

Accepted answer
  Vaibhav Chaudhari 38,671 Reputation points

    If you want to store the data in the mounted ADLS Gen2 location, give the path as /mnt/mountname/whateverfolder.

    I removed /dbfs/ from the path above. Try it and see if it works.

    The data stored by your current code should instead be sitting in internal ADB (DBFS) storage.



    Please don't forget to Accept Answer and Up-vote if the response helped -- Vaibhav
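Putting the accepted answer together with the question's snippet, a corrected version might look like this (a sketch only; the mount name and folder are taken from the question, and it assumes a Databricks Scala notebook where `spark` and the implicits for `toDF` are pre-imported):

```scala
// Sketch of the corrected snippet per the accepted answer.
val someDF = Seq(
  (8, "bat"),
  (64, "mouse"),
  (-27, "horse")
).toDF("number", "word")

// Write and read through the mount point directly:
// no "/dbfs/" prefix when using Spark APIs.
someDF.write.mode("overwrite").parquet("/mnt/Datalake3/feature/test1.parquet")
val data = spark.read.parquet("/mnt/Datalake3/feature/test1.parquet")
```

With this path, the files land in the mounted ADLS Gen2 container and should be visible both via `dbutils.fs.ls("/mnt/Datalake3/feature/")` and in Storage Explorer.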
