Error Handling in Data Pipeline

David 60 Reputation points
2025-08-26T12:06:17.17+00:00

My data pipeline looks like the one highlighted in purple:

[pipeline screenshot attached]

How do I handle an error if one occurs during the flow?

I am planning to have an error table in Azure SQL.

I'm trying to populate the error table whenever an error occurs in any part of the pipeline.

Could someone please help me?

Thanks a lot.

Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

1 answer

  1. Kalyani Kondavaradala 4,600 Reputation points Microsoft External Staff Moderator
    2025-08-26T13:16:25.1533333+00:00

    Hi @David,

    Thanks for posting your query on Microsoft Q&A!

    I understand you’d like to capture errors that occur anywhere in your highlighted pipeline (purple flow) and write those details into a centralized error table in Azure SQL Database.

    Can you try these steps and check whether they work for you?

    Create Error Logging Table in Azure SQL

    First, define a table (e.g., dbo.ErrorLog) to store error details:

    This is sample code; you can add or remove columns as per your requirements:

    CREATE TABLE dbo.ErrorLog
    (
        ErrorLogID INT IDENTITY(1,1) PRIMARY KEY,
        PipelineName NVARCHAR(200),
        RunId UNIQUEIDENTIFIER,
        ActivityName NVARCHAR(200),
        ErrorMessage NVARCHAR(MAX),
        ErrorTime DATETIME2
    );
    -- Create a stored procedure that inserts an error record whenever it is called from ADF:
    CREATE OR ALTER PROCEDURE dbo.usp_LogPipelineError
        @pipelineName NVARCHAR(200),
        @runId UNIQUEIDENTIFIER,
        @activityName NVARCHAR(200),
        @errorMessage NVARCHAR(MAX),
        @errorTime DATETIME2
    AS
    BEGIN
        INSERT INTO dbo.ErrorLog (PipelineName, RunId, ActivityName, ErrorMessage, ErrorTime)
        VALUES (@pipelineName, @runId, @activityName, @errorMessage, @errorTime);
    END
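
    Once both objects exist, you can sanity-check the procedure with a manual call before wiring it into ADF (a quick test with placeholder values):

    ```sql
    -- Log a dummy error, then inspect the most recent row
    EXEC dbo.usp_LogPipelineError
        @pipelineName = N'TestPipeline',
        @runId        = '00000000-0000-0000-0000-000000000000',
        @activityName = N'TestActivity',
        @errorMessage = N'Manual test entry',
        @errorTime    = '2025-08-26T12:00:00';

    SELECT TOP (1) * FROM dbo.ErrorLog ORDER BY ErrorLogID DESC;
    ```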
    

    Configure Error Handling in ADF

    In your pipeline, for each critical activity (such as a Databricks or Copy activity, wherever you want to capture logs), use the On Failure dependency.

    Connect the failure path to a Stored Procedure activity that calls usp_LogPipelineError.

    Pass system variables into the SP(stored procedure), such as:

    @pipeline().Pipeline -> pipeline name
    @pipeline().RunId -> run ID
    @activity('ActivityName').Error.Message -> error details
    @utcnow() -> timestamp
    

    Example parameter mapping in the SP activity:

    "storedProcedureParameters": {
        "pipelineName": { "value": "@pipeline().Pipeline", "type": "String" },
        "runId": { "value": "@pipeline().RunId", "type": "Guid" },
        "activityName": { "value": "@activity('CopyFiles').ActivityName", "type": "String" },
        "errorMessage": { "value": "@activity('CopyFiles').Error.Message", "type": "String" },
        "errorTime": { "value": "@utcnow()", "type": "Datetime" }
    }
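
    For reference, the failure-path wiring in the pipeline JSON looks roughly like this (a sketch only; "CopyFiles", "LogError", and "AzureSqlErrorDb" are hypothetical names — in the authoring UI you simply drag the red On Failure connector from the activity to the Stored Procedure activity):

    ```json
    {
        "name": "LogError",
        "type": "SqlServerStoredProcedure",
        "dependsOn": [
            { "activity": "CopyFiles", "dependencyConditions": [ "Failed" ] }
        ],
        "typeProperties": {
            "storedProcedureName": "dbo.usp_LogPipelineError"
        },
        "linkedServiceName": {
            "referenceName": "AzureSqlErrorDb",
            "type": "LinkedServiceReference"
        }
    }
    ```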
    

    Make sure your Linked Service to Azure SQL is correctly configured with write permissions.

    Please let us know how this works for you.

    Kindly consider upvoting this answer if the information provided is helpful. This can assist other community members in resolving similar issues.

    Thanks,

    Kalyani

    3 people found this answer helpful.
