You can use an ADF mapping data flow to copy data to and from a delta lake stored in Azure Data Lake Storage using the delta format. Delta is available only as an inline dataset and, by default, doesn't have an associated schema.
High-level steps:
1. Create a linked service and a dataset that connect to your data source (the data lake where the CSV files are stored).
2. Add a delta table as the sink (see the sketch after these steps).
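For the sink step, the linked document includes a delta sink script example. A minimal sketch along the same lines (the incoming stream name csvSource, the sink name deltaSink, and the folder path are placeholders, not values from your environment):

```
csvSource sink(
    insertable: true,
    updateable: false,
    deletable: false,
    upsertable: false,
    store: 'local',
    format: 'delta',
    folderPath: $tempPath + '/delta'
  ) ~> deltaSink
```

The insertable/updateable/deletable/upsertable flags control which row operations the sink accepts; enabling updates, deletes, or upserts also requires a keys list and an Alter Row transformation upstream of the sink.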
To get column metadata, click the Import schema button on the Projection tab. This lets you reference the column names and data types of the delta table. Note that a data flow debug session must be active to import the schema.
To use ADF to load files into a delta table whose columns are not static and change with every load, you can adapt the delta source script example provided in the document below. The script lets you specify the column names and data types for the source data; modify it to match the column names and data types in your .csv file.
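As a rough illustration of that pattern (the column names, types, and folder path below are assumptions for a CSV scenario, not values taken from the document), a delta source script with an explicit projection looks like this:

```
source(output(
        orderId as integer,
        customerName as string,
        orderDate as date
    ),
    store: 'local',
    format: 'delta',
    allowSchemaDrift: true,
    folderPath: $tempPath + '/delta'
  ) ~> deltaSource
```

With allowSchemaDrift: true, columns that appear in later loads but aren't declared in output() still flow through the data flow, which is what you want when the column set changes between loads.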
Please see the document below, which also includes a demonstration video:
https://learn.microsoft.com/en-us/azure/data-factory/format-delta
I hope this helps. Please let us know if you have any further questions.