Issue with Logging with Azure Data Factory

Krishna Nagesh Kukkadapu 21 Reputation points
2020-06-23T10:54:15.78+00:00

Hello,
I have a copy activity in Azure Data Factory that copies flat files from a file share to Azure Blob Storage. I would like to record a log of all the files being copied, along with the copy result and the data consistency check.

I tried enabling the data consistency check box and logging on the copy activity, but I was only able to view the results in the activity output, not record them for future use.

Can you please help with how we can fix this?

Thanks
Krishna.

Azure Data Factory

Accepted answer
  1. MartinJaffer-MSFT 26,236 Reputation points
    2020-06-24T18:19:03.45+00:00

    I did find a way to write the output to a file, using the "Additional columns" feature when copying a DelimitedText dataset. See the picture below.

    10555-patternwriteoutputdetail.jpg

    After the main copy activity completes, I grab the details I want from the output you shared and put them in a string-type variable.
    Then I run another copy activity. This one takes an almost blank CSV file, uses the "Additional columns" feature to add the variable as a new column, and writes the combination to a new CSV file.

    It is possible to skip the Set Variable step and reference the details directly, but I find the Set Variable makes for easier debugging.
    Which expression to use for getting the details depends on what you want to capture.
    If you wanted to capture only the data consistency verification, it could look like @{activity('Copy data').output.dataConsistencyVerification}.
    You probably also want to capture the activity run ID or pipeline run ID: @{activity('Copy data').activityRunId}
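    A sketch of what those two steps might look like in pipeline JSON. The activity name "Copy data", the variable name "copyDetails", and the dataset references are assumptions for illustration; adjust them to your pipeline:

    ```json
    {
        "name": "Set copy details",
        "type": "SetVariable",
        "dependsOn": [
            { "activity": "Copy data", "dependencyConditions": [ "Succeeded" ] }
        ],
        "typeProperties": {
            "variableName": "copyDetails",
            "value": {
                "value": "@{string(activity('Copy data').output.dataConsistencyVerification)}",
                "type": "Expression"
            }
        }
    },
    {
        "name": "Copy log to blob",
        "type": "Copy",
        "dependsOn": [
            { "activity": "Set copy details", "dependencyConditions": [ "Succeeded" ] }
        ],
        "typeProperties": {
            "source": {
                "type": "DelimitedTextSource",
                "additionalColumns": [
                    {
                        "name": "CopyDetails",
                        "value": { "value": "@variables('copyDetails')", "type": "Expression" }
                    }
                ]
            },
            "sink": { "type": "DelimitedTextSink" }
        },
        "inputs": [ { "referenceName": "AlmostBlankCsv", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "CopyLogCsv", "type": "DatasetReference" } ]
    }
    ```

    The second copy appends the variable's value as a column to every row of the near-empty source CSV, which is what lands the run details in the output file.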


2 additional answers

  1. Vaibhav Chaudhari 38,916 Reputation points Volunteer Moderator
    2020-06-23T14:03:03.053+00:00

    In my opinion, the logging functionality only logs the incompatible rows, or the files that were not accessible at the source - in short, it logs details of the files and rows that were skipped during the copy.

    By default, the Copy activity stops copying data and returns a failure when source data rows are incompatible with sink data rows. To make the copy succeed, you can configure the Copy activity to skip and log the incompatible rows and copy only the compatible data.

    https://learn.microsoft.com/en-us/azure/data-factory/copy-activity-overview#fault-tolerance
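    A sketch of that fault-tolerance configuration inside the copy activity's typeProperties. The linked service name and log path are assumptions for illustration:

    ```json
    "typeProperties": {
        "source": { "type": "DelimitedTextSource" },
        "sink": { "type": "DelimitedTextSink" },
        "enableSkipIncompatibleRow": true,
        "redirectIncompatibleRowSettings": {
            "linkedServiceName": {
                "referenceName": "AzureBlobStorageLinkedService",
                "type": "LinkedServiceReference"
            },
            "path": "copylogs/skipped"
        }
    }
    ```

    With this in place, skipped rows are written as CSV files under the given blob path instead of failing the activity.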

    ----------

    If the response helped, do "Accept Answer" and upvote it - Vaibhav


  2. dgreat juan 1 Reputation point
    2021-02-10T23:26:59.61+00:00

    You can store this JSON info in an array-type variable, then pass it to a Web activity as the body. The Web activity URL should be the HTTP trigger URL of a Logic App. In the Logic App you can then write the payload to a CSV file and output it to a blob.
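    A sketch of the Web activity that would post the copy output to a Logic App HTTP trigger. The URL placeholder and the activity name "Copy data" are assumptions for illustration:

    ```json
    {
        "name": "Post copy output to Logic App",
        "type": "WebActivity",
        "dependsOn": [
            { "activity": "Copy data", "dependencyConditions": [ "Succeeded" ] }
        ],
        "typeProperties": {
            "url": "https://<your-logic-app-http-trigger-url>",
            "method": "POST",
            "headers": { "Content-Type": "application/json" },
            "body": "@{string(activity('Copy data').output)}"
        }
    }
    ```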

