ADF logging mechanism

azure_learner 260 Reputation points
2024-09-20T14:35:25.8066667+00:00

I have a few options for capturing logs from Azure Data Factory for key metric analysis:

  1. Have SQL stored procedures capture key metrics from Azure Monitor, such as pipeline run times and durations, into a SQL table.
  2. Programmatically grab these through Azure Functions, PowerShell, or the Azure CLI (see the sketch below).
  3. Enable diagnostic settings and capture all logging through Log Analytics.

Out of these, which is the most cost-effective, scalable, and robust strategy? Please suggest use cases and pros and cons. Thank you.
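For context, I imagine the programmatic route (option 2) would be something like this PowerShell sketch, where the resource group and factory names are placeholders:

```powershell
# Sketch only: assumes the Az.DataFactory module (Install-Module Az.DataFactory)
# and an existing Connect-AzAccount session. "my-rg" and "my-adf" are placeholders.
$runs = Get-AzDataFactoryV2PipelineRun `
    -ResourceGroupName "my-rg" `
    -DataFactoryName "my-adf" `
    -LastUpdatedAfter (Get-Date).AddDays(-1) `
    -LastUpdatedBefore (Get-Date)

# Each run object carries the key metrics: status, start/end times, duration.
$runs | Select-Object PipelineName, RunId, Status, RunStart, RunEnd, DurationInMs
```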


Accepted answer
  Nandan Hegde 32,421 Reputation points MVP
    2024-09-20T15:06:01.3466667+00:00

    The ideal strategy depends on the following factors:

    1. What metrics are you trying to capture? Are they pipeline-level values such as PipelineRunId, StartTime, EndTime, DataRows, DataSize, etc., or metrics at the level of each and every activity?

    If they are the former, you can have a Stored Procedure activity within the main pipeline itself and pass pipeline parameters/expressions to capture those metrics directly (see the sketch after this list).

    2. How long do you want to retain the data? As far as I remember, diagnostic settings/Log Analytics have a fixed retention period after which historical records are cleaned up, so having a custom SQL table of your own lets you govern the retention period.
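To make the first point concrete, here is a minimal sketch, assuming an Azure SQL database with placeholder server/database names and the SqlServer PowerShell module. It creates a logging table and the stored procedure that the pipeline's Stored Procedure activity would call, with system expressions such as @pipeline().RunId, @pipeline().Pipeline, and @pipeline().TriggerTime mapped to the parameters:

```powershell
# Sketch only: assumes the SqlServer module (Install-Module SqlServer) and that
# authentication (e.g. -Username/-Password or Azure AD) is handled separately.
$server   = "my-server.database.windows.net"   # placeholder
$database = "my-db"                            # placeholder

# Logging table for pipeline-level metrics.
$table = @"
CREATE TABLE dbo.PipelineRunLog (
    PipelineRunId UNIQUEIDENTIFIER NOT NULL,
    PipelineName  NVARCHAR(200)    NOT NULL,
    StartTime     DATETIME2        NOT NULL,
    EndTime       DATETIME2        NULL,
    DataRows      BIGINT           NULL
);
"@

# Stored procedure the ADF Stored Procedure activity calls; in the activity,
# map the parameters to expressions such as @pipeline().RunId,
# @pipeline().Pipeline, @pipeline().TriggerTime, and @utcnow().
$proc = @"
CREATE PROCEDURE dbo.usp_LogPipelineRun
    @PipelineRunId UNIQUEIDENTIFIER,
    @PipelineName  NVARCHAR(200),
    @StartTime     DATETIME2,
    @EndTime       DATETIME2,
    @DataRows      BIGINT
AS
INSERT INTO dbo.PipelineRunLog (PipelineRunId, PipelineName, StartTime, EndTime, DataRows)
VALUES (@PipelineRunId, @PipelineName, @StartTime, @EndTime, @DataRows);
"@

Invoke-Sqlcmd -ServerInstance $server -Database $database -Query $table
Invoke-Sqlcmd -ServerInstance $server -Database $database -Query $proc
```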
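For comparison with the second point, once diagnostic settings stream to a Log Analytics workspace, the same run metrics can be queried with KQL. A sketch via the Az.OperationalInsights module, assuming a placeholder workspace ID and a resource-specific diagnostic setting (which populates the ADFPipelineRun table):

```powershell
# Sketch only: assumes the Az.OperationalInsights module and a resource-specific
# diagnostic destination, which exposes the ADFPipelineRun table.
$kql = @"
ADFPipelineRun
| where TimeGenerated > ago(7d)
| project TimeGenerated, PipelineName, RunId, Status
"@

$result = Invoke-AzOperationalInsightsQuery `
    -WorkspaceId "00000000-0000-0000-0000-000000000000" `
    -Query $kql

$result.Results | Format-Table
```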