Exploring Azure Data Factory: Questions on ETL Process Failures and Logging

Kyle1245 420 Reputation points
2024-02-22T02:32:34.5666667+00:00

Hello, as a newcomer to Azure Data Factory, I'm currently setting up ETLs to transfer data to an on-premises SQL Server database. While working with Azure Data Factory I've run into two issues, and I'm uncertain whether they stem from configuration problems or from inherent limitations of the tool:

1. When a record fails during the ETL, the process halts and ultimately no data is transferred to my destination database. Is this behavior expected?
2. There seems to be no comprehensive log or trace of the ETL process. Do I need to use a separate monitoring tool for this purpose?


Accepted answer
  1. phemanth 14,640 Reputation points Microsoft External Staff
    2024-02-23T08:57:25.1566667+00:00

    @Kyle1245
    Thanks for reaching out to Microsoft Q&A.

    ETL Record Failure: Azure Data Factory (ADF) supports conditional logic in its pipeline orchestration, so different paths can run depending on the outcome of a previous activity; this includes error handling in ETL/ELT logic. If an activity fails, you can define an alternative path to be executed on that failure. There are different patterns for handling errors, such as the Try-Catch block and the Do-If-Else block, and the overall pipeline's success or failure depends on which error-handling pattern you choose.

    Refer: https://learn.microsoft.com/en-us/azure/data-factory/tutorial-pipeline-failure-error-handling
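
    Not from the thread, but purely as a rough illustration of the Try-Catch pattern: a minimal sketch using the azure-mgmt-datafactory Python SDK in which a notification activity runs only when the Copy activity fails. The subscription, resource group, factory and dataset names and the webhook URL are all placeholder assumptions.

    ```python
    # Minimal sketch (assumed resource/dataset names) of the Try-Catch pattern:
    # the "catch" activity is wired to run only when the copy activity fails.
    # Requires: pip install azure-identity azure-mgmt-datafactory
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient
    from azure.mgmt.datafactory.models import (
        PipelineResource, CopyActivity, WebActivity, ActivityDependency,
        DatasetReference, BlobSource, SqlSink,
    )

    client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

    # "Try": copy from the source dataset into the SQL Server sink dataset.
    copy_step = CopyActivity(
        name="CopyToSqlServer",
        inputs=[DatasetReference(type="DatasetReference", reference_name="SourceBlobDataset")],
        outputs=[DatasetReference(type="DatasetReference", reference_name="SinkSqlServerDataset")],
        source=BlobSource(),
        sink=SqlSink(),
    )

    # "Catch": runs only when the copy step ends in the Failed state.
    notify_on_failure = WebActivity(
        name="NotifyOnFailure",
        method="POST",
        url="https://example.com/alert",  # assumed webhook endpoint
        body={"message": "Copy to SQL Server failed"},
        depends_on=[ActivityDependency(activity="CopyToSqlServer",
                                       dependency_conditions=["Failed"])],
    )

    client.pipelines.create_or_update(
        "<resource-group>", "<data-factory-name>", "CopyWithErrorHandling",
        PipelineResource(activities=[copy_step, notify_on_failure]),
    )
    ```

    Whether the pipeline as a whole is then reported as succeeded or failed depends on how the dependency conditions are arranged, which is what the tutorial above walks through.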

    Logging and Monitoring: ADF can write diagnostic logs to Azure Monitor. Azure Monitor provides base-level infrastructure metrics and logs for most Azure services. Data Factory itself stores pipeline-run data for only 45 days, so use Azure Monitor if you want to keep that data for longer; it also lets you route the diagnostic logs to several different targets for analysis, such as a Log Analytics workspace, a storage account, or an event hub. The schema used by ADF logs and events for monitoring is described in detail in the article below.

    Refer: https://learn.microsoft.com/en-us/azure/data-factory/monitor-using-azure-monitor
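
    For example, a minimal sketch (all resource IDs and names are assumed placeholders) that adds a diagnostic setting with the azure-mgmt-monitor Python SDK, routing the PipelineRuns, ActivityRuns and TriggerRuns categories to a Log Analytics workspace:

    ```python
    # Minimal sketch (assumed IDs/names): send ADF run logs to a Log Analytics
    # workspace so run history is kept beyond the 45 days ADF stores by default.
    # Requires: pip install azure-identity azure-mgmt-monitor
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.monitor import MonitorManagementClient
    from azure.mgmt.monitor.models import DiagnosticSettingsResource, LogSettings

    monitor = MonitorManagementClient(DefaultAzureCredential(), "<subscription-id>")

    factory_id = (
        "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
        "/providers/Microsoft.DataFactory/factories/<data-factory-name>"
    )
    workspace_id = (
        "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
        "/providers/Microsoft.OperationalInsights/workspaces/<workspace-name>"
    )

    monitor.diagnostic_settings.create_or_update(
        resource_uri=factory_id,
        name="adf-to-log-analytics",
        parameters=DiagnosticSettingsResource(
            workspace_id=workspace_id,
            # "Dedicated" writes to resource-specific tables (ADFPipelineRun,
            # ADFActivityRun, ADFTriggerRun) instead of the shared AzureDiagnostics table.
            log_analytics_destination_type="Dedicated",
            logs=[
                LogSettings(category="PipelineRuns", enabled=True),
                LogSettings(category="ActivityRuns", enabled=True),
                LogSettings(category="TriggerRuns", enabled=True),
            ],
        ),
    )
    ```

    The same diagnostic setting can instead (or additionally) target a storage account or an event hub, and once the logs are in the workspace you can query them with KQL.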

    Hope this helps. Do let us know if you have any further queries.


    If this answers your query, do click Accept Answer and Yes for "Was this answer helpful". And if you have any further queries, do let us know.


1 additional answer

  1. Nandan Hegde 34,426 Reputation points MVP
    2024-02-22T03:55:29.67+00:00

    You can use the fault tolerance settings of the Copy activity to handle this in the two ways below:

    1. Skip incompatible rows and proceed with the copy
    2. Log the incompatible (skipped) records

    MSFT Doc :

    https://learn.microsoft.com/en-us/azure/data-factory/copy-activity-fault-tolerance
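
    For illustration, a minimal sketch of those two settings on a Copy activity via the azure-mgmt-datafactory Python SDK (the dataset and linked service names are assumed placeholders; newer factories can alternatively use the session-log settings described in the doc above):

    ```python
    # Minimal sketch (assumed names): a fault-tolerant Copy activity that skips
    # incompatible rows instead of failing, and redirects the skipped rows to a
    # blob path for later review.
    from azure.mgmt.datafactory.models import (
        CopyActivity, DatasetReference, BlobSource, SqlSink,
        RedirectIncompatibleRowSettings, LinkedServiceReference,
    )

    fault_tolerant_copy = CopyActivity(
        name="CopyWithFaultTolerance",
        inputs=[DatasetReference(type="DatasetReference", reference_name="SourceBlobDataset")],
        outputs=[DatasetReference(type="DatasetReference", reference_name="SinkSqlServerDataset")],
        source=BlobSource(),
        sink=SqlSink(),
        # 1. Skip rows the sink rejects (type mismatch, PK violation, ...) and keep copying.
        enable_skip_incompatible_row=True,
        # 2. Write the skipped rows to blob storage so they can be inspected/reprocessed.
        redirect_incompatible_row_settings=RedirectIncompatibleRowSettings(
            linked_service_name=LinkedServiceReference(
                type="LinkedServiceReference", reference_name="AzureBlobStorageLS"),
            path="copy-activity-errors",
        ),
    )
    # The activity is then added to a pipeline's activities list and deployed as usual.
    ```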

    But irrespective of fault tolerance, the Copy activity is not transaction-bound: it would copy the data up to the point of failure (in case fault tolerance is not enabled).

    So I am not sure what you mean by no data being transferred between the systems. Can you provide more details on what your source and sink are, and what your flow looks like?

