Data Factory Logs to Databricks

Yohanna de Oliveira Cavalcanti 160 Reputation points
2024-07-16T19:26:32.8633333+00:00

I need to create a way to send logs from Data Factory to the Databricks Catalog. What is the most cost-effective and efficient method to achieve this?


Accepted answer
  1. Azar 21,800 Reputation points MVP
    2024-07-16T19:34:56.8433333+00:00

    Hi there Yohanna de Oliveira Cavalcanti,

    Thanks for using the Q&A platform.

    Start by configuring diagnostic settings in Data Factory to export logs to an Azure Storage account. Then use Databricks Auto Loader to incrementally read these logs from Azure Storage. Finally, create managed tables in Unity Catalog and set up an ETL pipeline in Databricks to process and write the logs into those tables. This approach uses Azure Storage for low-cost log export and Auto Loader for efficient incremental ingestion, giving you a scalable, cost-effective solution. A minimal ingestion sketch is shown below.
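
    In case it helps, here is a minimal PySpark sketch of the Auto Loader step, written for a Databricks notebook (where `spark` is already defined). The storage account, checkpoint path, and the `main.adf_logs.activity_runs` table name are placeholders you would replace with your own, and the container name assumes the diagnostic setting exports ActivityRuns logs to the default `insights-logs-activityruns` container.

    ```python
    from pyspark.sql import functions as F

    # Placeholders: replace with your storage account, container, and Unity Catalog table
    log_path = "abfss://insights-logs-activityruns@<storageaccount>.dfs.core.windows.net/"
    checkpoint_path = "abfss://checkpoints@<storageaccount>.dfs.core.windows.net/adf_logs/"

    (spark.readStream
        .format("cloudFiles")                               # Auto Loader source
        .option("cloudFiles.format", "json")                # ADF diagnostic logs land as JSON
        .option("cloudFiles.schemaLocation", checkpoint_path)  # track the inferred schema
        .load(log_path)
        .withColumn("ingested_at", F.current_timestamp())   # simple enrichment step
        .writeStream
        .option("checkpointLocation", checkpoint_path)
        .trigger(availableNow=True)                         # incremental batch run, then stop
        .toTable("main.adf_logs.activity_runs"))            # managed Unity Catalog table
    ```

    Running the stream with `trigger(availableNow=True)` processes only the files that arrived since the last run, so you can schedule it as an inexpensive periodic job instead of keeping a cluster running continuously.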

    https://learn.microsoft.com/en-us/azure/data-factory/monitor-data-factory

    https://learn.microsoft.com/en-us/azure/databricks/ingestion/auto-loader/

    https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/

    If this helps, kindly accept the response. Thanks very much.

