Hi Yohanna de Oliveira Cavalcanti,
Thanks for using the Q&A platform.
Start by configuring diagnostic settings in Data Factory to export logs to an Azure Storage account. Then use Databricks Auto Loader to incrementally read these logs from Azure Storage. Finally, create managed tables in Unity Catalog and set up an ETL pipeline in Databricks to process the logs and write them into those tables (see the sketches below). This approach combines Azure Storage for low-cost log export with Auto Loader for efficient incremental ingestion, giving you a scalable solution.
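If you want to script the first step instead of using the portal, here is a minimal sketch using the azure-mgmt-monitor SDK. All resource IDs, the setting name, and the choice of log categories are placeholders/assumptions to adapt to your environment:

```python
# Sketch: enable Data Factory diagnostic settings that export logs to a storage account.
# All resource IDs and names below are placeholders - substitute your own.
from azure.identity import DefaultAzureCredential
from azure.mgmt.monitor import MonitorManagementClient

client = MonitorManagementClient(DefaultAzureCredential(), "<subscription-id>")

client.diagnostic_settings.create_or_update(
    resource_uri=(
        "/subscriptions/<subscription-id>/resourceGroups/<rg>"
        "/providers/Microsoft.DataFactory/factories/<factory-name>"
    ),
    name="export-logs-to-storage",  # placeholder setting name
    parameters={
        "storage_account_id": (
            "/subscriptions/<subscription-id>/resourceGroups/<rg>"
            "/providers/Microsoft.Storage/storageAccounts/<account-name>"
        ),
        # PipelineRuns is one of Data Factory's log categories;
        # add ActivityRuns / TriggerRuns as needed.
        "logs": [{"category": "PipelineRuns", "enabled": True}],
    },
)
```

For the ingestion side (steps 2 and 3), a minimal Auto Loader stream run in a Databricks notebook could look like the sketch below. The container name, schema/checkpoint paths, and the main.adf_logs.pipeline_runs table name are assumptions; Azure Monitor typically writes each category into an insights-logs-<category> container:

```python
# Sketch for a Databricks notebook, where `spark` is predefined.
from pyspark.sql import functions as F

# Assumption: the PipelineRuns category lands in a container named
# insights-logs-pipelineruns in your storage account.
source_path = "abfss://insights-logs-pipelineruns@<storage-account>.dfs.core.windows.net/"

(spark.readStream
    .format("cloudFiles")                                  # Auto Loader
    .option("cloudFiles.format", "json")                   # diagnostic logs are JSON
    .option("cloudFiles.schemaLocation",
            "/Volumes/main/adf_logs/schemas")              # placeholder path for schema tracking
    .load(source_path)
    .withColumn("ingested_at", F.current_timestamp())      # simple ETL step; add parsing as needed
    .writeStream
    .option("checkpointLocation",
            "/Volumes/main/adf_logs/checkpoints")          # placeholder path
    .trigger(availableNow=True)                            # process only new files, then stop
    .toTable("main.adf_logs.pipeline_runs"))               # managed Unity Catalog table (placeholder name)
```

Scheduling this notebook as a Databricks job gives you the incremental ETL pipeline; Auto Loader's checkpoint ensures each exported log file is ingested exactly once.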
https://learn.microsoft.com/en-us/azure/data-factory/monitor-data-factory
https://learn.microsoft.com/en-us/azure/databricks/ingestion/auto-loader/
https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/
If this helps, kindly accept the answer. Thanks!