Fault tolerance in Copy activity

Poel van der, RE (Ron) 421 Reputation points

Hi, I am copying a Parquet file into an Azure SQL Database table using the Copy activity with fault tolerance enabled, so skipped rows are written to a log. We have a steering table with an attribute containing the maximum number of skipped records. If the number of skipped rows exceeds that maximum, the pipeline should fail, or the activity should stop ingesting further data. This would be handy for very high volumes: in the current situation you have to process everything before you get the SkippedRows count. Is what I am suggesting possible? Regards, Ron

Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

Accepted answer
  1. KranthiPakala-MSFT 46,422 Reputation points Microsoft Employee

    Hi @Poel van der, RE (Ron) ,

    I have verified with the product team and confirmed that ADF currently doesn't support stopping the run when a skipped-row threshold is hit. I would recommend sharing your feedback/suggestion in the ADF user voice forum: https://feedback.azure.com/forums/270578-data-factory

    All feedback shared in this forum is actively monitored and reviewed by the ADF engineering team. Please also share the feedback link once it is posted, so that other users with a similar idea can up-vote and/or comment on your suggestion, which would help increase the priority of the feature request.
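
    In the meantime, a common pattern is to check the skipped-row count after the Copy activity completes and fail the pipeline from there; as you noted, all rows are still processed first, but downstream activities are blocked and the run surfaces as failed. A minimal sketch, assuming fault tolerance is enabled so the Copy activity reports `rowsSkipped` in its output; the activity name `CopyParquetToSql`, the parameter `maxSkippedRows` (which could be looked up from your steering table), and the specific error message are illustrative:

    ```json
    {
        "name": "CheckSkippedRows",
        "type": "IfCondition",
        "dependsOn": [
            { "activity": "CopyParquetToSql", "dependencyConditions": [ "Succeeded" ] }
        ],
        "typeProperties": {
            "expression": {
                "value": "@greater(activity('CopyParquetToSql').output.rowsSkipped, int(pipeline().parameters.maxSkippedRows))",
                "type": "Expression"
            },
            "ifTrueActivities": [
                {
                    "name": "FailOnSkippedThreshold",
                    "type": "Fail",
                    "typeProperties": {
                        "message": "Skipped row count exceeded the configured maximum.",
                        "errorCode": "SkippedRowsThresholdExceeded"
                    }
                }
            ]
        }
    }
    ```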

    Hope this info helps.


    Thank you
    Please consider clicking "Accept Answer" and "Upvote" on the post that helps you, as it can be beneficial to other community members.

    Announcement : Want to participate in Azure Data Factory hackathon'21? Come hack with us :) - https://aka.ms/adfhack21 (Note: Submission Deadline 24th Feb'21)

    1 person found this answer helpful.
    No comments

0 additional answers
