Hi,
You can try the code below (I am assuming the files are under the test/file folder):
final_files_list = []
file_list = dbutils.fs.ls("/mnt/mount/test/file")
for file in file_list:
    # Use file.path (the full path), not file.name, so Spark can locate each file
    final_files_list.append(file.path)
df = spark.read.load(path=final_files_list, format="csv", sep=",", inferSchema="true", header="true")
df.createOrReplaceTempView("TestFiles")
This creates a PySpark DataFrame from the given CSV files and registers it as a temp view called TestFiles.
Please try it and let us know if you have any questions.
Thanks