Unable to view parquet files

Abhishek Gaikwad 191 Reputation points
2021-02-26T14:55:58.643+00:00

When I run the code below in Scala on Databricks, it runs successfully and I am able to read the file back from the same location.
However, when I run `display(dbutils.fs.ls("mnt/Datalake3/feature/"))` I cannot see any of the parquet files.
I am also unable to locate the files in the storage account through the portal, or to view them in Azure Storage Explorer.
I am writing to Data Lake Storage Gen2.

```scala
val someDF = Seq(
  (8, "bat"),
  (64, "mouse"),
  (-27, "horse")
).toDF("number", "word")

someDF.write.mode("overwrite").parquet("/dbfs/mnt/Datalake3/feature/test1.parquet")

val data = sqlContext.read.parquet("/dbfs/mnt/Datalake3/feature/test1.parquet")
display(data)

dbutils.fs.ls("abfss://demo@lakgen2.dfs.core.windows.net/")
```

Accepted answer
    Vaibhav Chaudhari 38,686 Reputation points
    2021-02-26T15:14:23.89+00:00

    If you want to store the data in the mounted ADLS Gen2 container, just give a path like /mnt/mountname/whateverfolder:

    ```scala
    someDF.write.mode("overwrite").parquet("/mnt/Datalake3/feature/test1.parquet")
    ```

    I removed /dbfs/ from the above path. Try it and see if it works.
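
    As a quick check (a sketch reusing the mount and folder names from the question), list the target folder through the mount right after the write; the parquet part files should now show up both there and in the storage account:

    ```scala
    // List the rewritten output through the mount. The mount name and file
    // name come from the question; adjust them for your own setup.
    display(dbutils.fs.ls("/mnt/Datalake3/feature/test1.parquet"))
    ```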

    The data that your current code wrote went to internal ADB (DBFS) storage instead: Spark resolves a path without a scheme against DBFS, so /dbfs/mnt/Datalake3/feature/ is treated as a literal dbfs:/dbfs/mnt/Datalake3/feature/ folder rather than the mounted location. (The /dbfs/ prefix is only needed when accessing DBFS through local file APIs via the FUSE mount.)

    [72469-image.png: screenshot of the internal DBFS storage location]
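
    If you want to confirm where the earlier write landed, or clean up the stray folder once the data has been rewritten to the mount, something along these lines should work (paths assume the question's folder names):

    ```scala
    // Inspect the folder the original code actually wrote to ...
    display(dbutils.fs.ls("dbfs:/dbfs/mnt/Datalake3/feature/"))

    // ... and remove it recursively once it is no longer needed.
    dbutils.fs.rm("dbfs:/dbfs/mnt/Datalake3/feature/", true)
    ```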

    ----------

    Please don't forget to Accept Answer and Up-vote if the response helped -- Vaibhav


0 additional answers
