Ingest several types of CSVs with Databricks Auto Loader

Lima, Leonardo
2021-10-15T14:32:20.257+00:00

I'm trying to load several types of CSV files with Auto Loader. Currently it merges every CSV I drop into one big parquet table; what I want instead is a separate parquet table for each type of schema/CSV file.

What I currently have:
[Image: 140915-stackquestion1.png]

Streaming files / waiting for a file to be dropped:

# Single Auto Loader stream: every CSV dropped into sourcePath is read here
spark.readStream.format("cloudFiles") \
.option("cloudFiles.format", "csv") \
.option("delimiter", "~|~") \
.option("cloudFiles.inferColumnTypes", "true") \
.option("cloudFiles.schemaLocation", pathCheckpoint) \
.load(sourcePath) \
.writeStream \
.format("delta") \
.option("mergeSchema", "true") \
.option("checkpointLocation", pathCheckpoint) \
.start(pathResult)
# mergeSchema evolves the single Delta table, so all the different
# CSV schemas end up merged into one table at pathResult

What I want:
[Image: 140933-stackquestion2.png]
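One way to get there (a minimal sketch, not a confirmed answer): run one independent Auto Loader stream per CSV type, each with its own schema location, checkpoint, and target table. The folder layout (/landing/type_a, etc.) and the start_stream helper below are assumptions for illustration.

def start_stream(source_path, schema_path, checkpoint_path, target_path):
    # One Auto Loader stream per CSV type: separate schema inference,
    # separate checkpoint, separate Delta output table
    return (spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "csv")
        .option("delimiter", "~|~")
        .option("cloudFiles.inferColumnTypes", "true")
        .option("cloudFiles.schemaLocation", schema_path)
        .load(source_path)
        .writeStream
        .format("delta")
        .option("checkpointLocation", checkpoint_path)
        .start(target_path))

# Hypothetical paths: one source subfolder per CSV type
start_stream("/landing/type_a", "/schemas/type_a", "/checkpoints/type_a", "/tables/type_a")
start_stream("/landing/type_b", "/schemas/type_b", "/checkpoints/type_b", "/tables/type_b")

This assumes each file type can be routed to its own source folder; if all types land in the same folder, each stream would need to pick out its own files some other way, e.g. a glob pattern in the load path.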
