Hi Yang Chow Mun,
I understand that you want to load some files into an Azure Data Lake Storage (ADLS) Gen2 account so that they can be read or mounted from Databricks.
First, to write data into ADLS Gen2 from Logic Apps, you can use the Azure Blob Storage connector. Although there is no dedicated ADLS Gen2 connector in Logic Apps, an ADLS Gen2 container is also exposed through the Blob endpoint, so you can use the 'Create blob' action to write data into it. Secondly, you need to access this storage from Databricks, either by mounting the storage account container or by reading and writing directly.
- https://learn.microsoft.com/en-us/azure/connectors/connectors-create-api-azureblobstorage?tabs=consumption
- https://techcommunity.microsoft.com/t5/azure-paas-blog/mount-adls-gen2-or-blob-storage-in-azure-databricks/ba-p/3802926
- https://docs.databricks.com/en/connect/storage/azure-storage.html
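As a rough sketch of the two access options from a Databricks notebook, the snippet below shows direct access via an `abfss://` URI and a one-time mount with `dbutils.fs.mount`. All names in it (storage account, container, secret scope and key names, application and tenant IDs, mount point) are placeholders, not values from your environment:

```python
# Runs in a Databricks notebook, where `spark` and `dbutils` are predefined.
# Every name below is a placeholder -- substitute your own values.

storage_account = "mystorageaccount"   # placeholder
container = "mycontainer"              # placeholder

# Option 1: read directly via the abfss:// URI, authenticating with a
# storage account key kept in a Databricks secret scope (names assumed).
spark.conf.set(
    f"fs.azure.account.key.{storage_account}.dfs.core.windows.net",
    dbutils.secrets.get(scope="my-scope", key="storage-account-key"),
)
df = spark.read.parquet(
    f"abfss://{container}@{storage_account}.dfs.core.windows.net/raw/"
)

# Option 2: mount the container once under /mnt using a service principal
# (OAuth client credentials), so notebooks can use a plain path afterwards.
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-id>",
    "fs.azure.account.oauth2.client.secret":
        dbutils.secrets.get(scope="my-scope", key="sp-client-secret"),
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}
dbutils.fs.mount(
    source=f"abfss://{container}@{storage_account}.dfs.core.windows.net/",
    mount_point="/mnt/datalake",
    extra_configs=configs,
)
df = spark.read.parquet("/mnt/datalake/raw/")
```

Direct access keeps credentials scoped to the session, while a mount is shared by everyone on the workspace, so pick whichever fits your security model.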
Regarding the file format, Parquet is commonly used because it is a columnar storage format optimized for big data processing frameworks such as Apache Spark and Hadoop. That said, Databricks can also read and write various other formats, including CSV, JSON, and ORC.
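For example, a CSV file landed by Logic Apps can be converted to Parquet from Databricks in a couple of lines. The paths below assume a mount point such as /mnt/datalake and are placeholders:

```python
# Runs in a Databricks notebook, where `spark` is predefined.
# Paths are placeholders -- adjust them to your container layout.

# Read the CSV that Logic Apps wrote, inferring a header row.
df = spark.read.option("header", "true").csv("/mnt/datalake/raw/input.csv")

# Write it out as Parquet: the schema is stored with the data and columns
# are laid out contiguously, so later queries can skip unneeded columns.
df.write.mode("overwrite").parquet("/mnt/datalake/curated/input_parquet/")

# Reading it back later:
parquet_df = spark.read.parquet("/mnt/datalake/curated/input_parquet/")
```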
References:
- https://stackoverflow.com/questions/56153725/is-there-a-way-to-load-data-to-azure-data-lake-storage-gen-2-using-logic-app
- https://learn.microsoft.com/en-us/azure/databricks/delta-live-tables/load
- https://techcommunity.microsoft.com/t5/fasttrack-for-azure/integrating-microsoft-fabric-with-azure-databricks-delta-tables/ba-p/3916332

If the information helped address your question, please Accept the answer.
Luis