Azure Data Factory Custom Logging

Smitha Krishna Murthy 1 Reputation point
2022-09-28T07:53:39.92+00:00

I have an Azure Data Pipeline which has a data flow as below. I need to get the count of records processed in each of these steps so that I can write it to a custom table in Snowflake. How do I achieve this?
[Image: 245350-capture.png (screenshot of the data flow)]

Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

1 answer

  1. KranthiPakala-MSFT 46,642 Reputation points Microsoft Employee Moderator
    2022-09-29T00:52:23.693+00:00

    Hello @Smitha Krishna Murthy ,

    Thanks for the question and using MS Q&A platform.

    As per my understanding, you would like to get the record count for certain transformations (streams) and log those counts to a sink of your choice. To get the count of a transformation's output, create a new branch (that is, a new stream) from that transformation and add an Aggregate transformation to it. Leave the group-by property empty (it is optional); then, in the aggregate settings, give a column name such as rowCountOfX (just an example) and use count() in the expression box.
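    As a rough illustration, the branch-plus-Aggregate setup corresponds to data flow script along these lines (a minimal sketch; the stream name source1 and column name rowCountOfSource are illustrative assumptions, not part of your pipeline):

    ```
    // Hypothetical sketch: the source stream feeds both the main flow
    // and a new branch used only for counting.
    source(allowSchemaDrift: true,
        validateSchema: false) ~> source1

    // New branch: Aggregate with no group-by clause, so count()
    // runs over every row and emits a single-row result.
    source1 aggregate(rowCountOfSource = count()) ~> CountRows
    ```

    Because the group-by is empty, the aggregate collapses the whole stream into one row containing the count.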

    Please see below example for reference:

    [Image: 245804-image.png]

    [Image: 245815-image.png]

    Once you have the row count, add a sink transformation pointing to your desired data store and map the column that holds the row count.
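    In data flow script form, the sink step might look like the sketch below (a rough illustration only; it assumes an upstream Aggregate stream named CountRows producing a rowCountOfSource column, and a sink dataset for your Snowflake table already defined in the factory):

    ```
    // Hypothetical sketch: write the single-row count output
    // to the logging table via the configured sink dataset.
    CountRows sink(input(rowCountOfSource as long),
        skipDuplicateMapInputs: true,
        skipDuplicateMapOutputs: true) ~> LogRowCount
    ```

    The sink's mapping tab is where rowCountOfSource is mapped onto the target column of the custom table.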

    To explore more about aggregate transformation please refer to this video: Aggregate Transformation in Mapping Data Flow in Azure Data Factory

    Hope this will help. Please let us know if any further queries.

    ------------------------------

    • Please don't forget to click the Accept Answer or upvote button whenever the information provided helps you. Original posters help the community find answers faster by identifying the correct answer. Here is how
