Send Data Factory resource logs to a custom table in Log Analytics Workspace

Saif Alramahi 0 Reputation points
2023-05-10T12:09:18.5566667+00:00

Hello,

I'm implementing Azure Log Analytics Workspace as a central log store in my architecture for data observability. Currently, the sources are mainly Azure Data Factory and external applications.

I know it's possible to send ADF resource logs to a Log Analytics workspace through diagnostic settings, landing the data in Azure tables (e.g. ADFPipelineRuns, ADFActivityRuns). It's also possible to send custom logs from external sources via the REST API, which lands the data in custom log tables with the suffix _CL.

My end goal is to have two main tables in the Log Analytics workspace: "Executions_CL" for all my execution logs and "Activities_CL" for all my activity logs.

Is there a way to send ADF pipeline run and activity run log data to my preferred custom tables (Executions_CL and Activities_CL) instead of ADFPipelineRuns and ADFActivityRuns? This way I can send execution and activity log data from external sources to the same tables.

I have worked on a method to construct pipeline and activity run logs within ADF and send them via HTTP to a DCE, but this requires significant development work on every pipeline. I need a method that leverages what diagnostic settings already does, but sends my data to custom tables instead of the built-in Azure tables.
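For context, the per-pipeline approach I'm trying to avoid looks roughly like this: a Web activity in each pipeline posting a record to the DCE. The body sketch below uses ADF expression syntax; the column names are illustrative, not a required schema.

```json
[
  {
    "TimeGenerated": "@{utcnow()}",
    "PipelineName": "@{pipeline().Pipeline}",
    "RunId": "@{pipeline().RunId}",
    "Status": "InProgress"
  }
]
```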

Thanks!

Azure Monitor
Azure Data Factory

2 answers

  1. dashanan13 930 Reputation points
    2023-05-10T13:01:51.2933333+00:00

@Saif Alramahi Thank you for reaching out to Microsoft Q&A.

As I understand it, you need to send data to a custom Log Analytics workspace table.

It is possible to do this via the Logs Ingestion API; here is a tutorial that walks through the steps:

    https://learn.microsoft.com/en-us/azure/azure-monitor/logs/tutorial-logs-ingestion-portal
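As a rough sketch of what the tutorial sets up: you POST a JSON array of records to your data collection endpoint (DCE), addressed by the data collection rule (DCR) immutable ID and a custom stream name. The endpoint, DCR ID, and record fields below are placeholders; substitute your own, and authenticate with an AAD bearer token scoped to `https://monitor.azure.com`.

```python
import json

# Placeholder identifiers -- replace with your own DCE/DCR values.
DCE_ENDPOINT = "https://my-dce-xyz.eastus-1.ingest.monitor.azure.com"
DCR_IMMUTABLE_ID = "dcr-00000000000000000000000000000000"
STREAM_NAME = "Custom-Executions_CL"

def build_ingestion_request(dce, dcr_id, stream, records):
    """Compose the URL and JSON body for a Logs Ingestion API call."""
    url = (f"{dce}/dataCollectionRules/{dcr_id}"
           f"/streams/{stream}?api-version=2023-01-01")
    body = json.dumps(records)  # the API expects a JSON array of records
    return url, body

url, body = build_ingestion_request(
    DCE_ENDPOINT, DCR_IMMUTABLE_ID, STREAM_NAME,
    [{"TimeGenerated": "2023-05-10T12:00:00Z",
      "PipelineName": "CopySales", "Status": "Succeeded"}])
# POST `body` to `url` with an Authorization: Bearer <token> header.
```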

Please mark it as the answer if it helps.


  2. AnuragSingh-MSFT 21,241 Reputation points
    2023-05-11T10:41:31.6933333+00:00

    @Saif Alramahi , thank you for posting this question on Microsoft Q&A. I understand that you are trying to send ADF logs to specific tables in Log Analytics workspace.

When using the "Send to Log Analytics workspace" option in Diagnostic settings, there are only limited options for specifying the destination table, as shown below:

[screenshot: Diagnostic settings destination table options]

For more details on the available options, please see Send to Log Analytics workspace.

Coming to your requirement, you would need a custom solution to route the logs into specific custom tables. In this regard, the "Stream to an event hub" option is a better fit than forwarding logs directly to the LA workspace:

[screenshot: "Stream to an event hub" option]

Doing so, you can use an Azure Function with an Event Hub trigger, adding logic to write the logs to your custom tables as they are forwarded. For more details, see Azure Event Hubs trigger for Azure Functions.
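The core of such a Function is just routing: diagnostic settings wrap records as `{"records": [...]}`, and each record carries a `category` field identifying the log type. A minimal sketch of that routing logic follows; the category names and record fields are assumptions based on typical ADF diagnostic records, so verify them against your actual payloads.

```python
import json

# Map ADF diagnostic log categories to the desired custom tables.
# Category names here are assumptions -- check your actual records.
TABLE_FOR_CATEGORY = {
    "PipelineRuns": "Executions_CL",
    "ActivityRuns": "Activities_CL",
}

def route_records(event_body: str) -> dict:
    """Group diagnostic-log records from one Event Hub message by target table."""
    routed = {}
    for record in json.loads(event_body).get("records", []):
        table = TABLE_FOR_CATEGORY.get(record.get("category"))
        if table:  # ignore categories we don't forward (e.g. trigger runs)
            routed.setdefault(table, []).append(record)
    return routed

# Example: one Event Hub message containing a pipeline run and an activity run
message = json.dumps({"records": [
    {"category": "PipelineRuns", "pipelineName": "CopySales"},
    {"category": "ActivityRuns", "activityName": "CopyBlob"},
]})
routed = route_records(message)
# Each list would then be uploaded to its table via the Logs Ingestion API.
```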

Azure Functions is a serverless solution that allows you to write less code and maintain less infrastructure. With this approach, only the write to the destination needs to be coded; polling and triggering are handled by Event Hubs and the Function App's trigger.

Alternatively, if the logs are already in the LA workspace (via the default option of streaming logs to LA in Diagnostic settings), you can write custom functions that query only what you need. For details, see Use a function in LA Workspace.
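For example, a saved function could present ADFPipelineRuns in the "executions" shape you want; the function name and projected columns below are illustrative, so adjust them to your schema:

```kusto
// Save this query as a function named "Executions" in the LA workspace.
ADFPipelineRuns
| project TimeGenerated, PipelineName, RunId, Status, Start, End
```

Querying `Executions` then behaves much like querying a custom table would, without moving any data.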

Functions behave much like querying a custom log table and also give you the flexibility to update the logic as requirements change. This is the easiest, almost maintenance-free way to achieve your end goal: if the schema of the data streamed from ADF changes in the future, the data will still land in LA and only the function's query might need updating. With Event Hubs or another custom solution, you would have to replicate any such changes in the custom code as well.

    Hope this helps. If you have any questions, please let us know.

    If the answer helped, please click Accept answer so that it can help others in the community looking for help on similar topics.