Data Factory copy data to another container

Alex Corovin 46 Reputation points
2020-09-17T23:23:48.12+00:00

Hi Everyone,

Could you help me, please?

In Data Factory I created a Copy Data pipeline from one Storage account (containers) to another Storage account (containers), and the copy is working fine. Could you tell me please how I can configure it to keep a new copy every time it runs? For example, at the end of the month I would like to have around 31 copies, and I would like to delete copies older than 3 months.

thank you very much,

Azure Data Factory

Accepted answer
  1. HarithaMaddi-MSFT 10,136 Reputation points
    2020-09-18T12:25:09.147+00:00

    Hi @Alex Corovin ,

    Welcome to Microsoft Q&A Platform. Thanks for posting the query.

    This can be complicated if there are several nested folders inside a container and they are dynamic, but if the hierarchy is fixed in both source and destination, it is simple. If there are nested folders, we need to implement multiple pipelines and reuse them in the main pipeline to loop through each of them for files.

    The core part of the requirement - deleting files older than a given age (three months in this case) - can be achieved using ForEach, Get Metadata, If Condition and Delete activities as below: the last modified date of a file is obtained from the Get Metadata activity and compared with the current date in an If Condition activity expression. If the condition is true, a Delete activity removes that file.

    25767-deletefilesageadf.gif
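
    As a rough sketch of the key step in the animation above (the activity name "Get File Metadata", the dataset name and the 90-day cutoff are illustrative assumptions, not taken from the attached pipelines), the If Condition expression and its delete branch could look like this, assuming the Get Metadata activity requests the "Last modified" field:

        {
            "name": "If Older Than 3 Months",
            "type": "IfCondition",
            "typeProperties": {
                "expression": {
                    "value": "@less(ticks(activity('Get File Metadata').output.lastModified), ticks(addDays(utcnow(), -90)))",
                    "type": "Expression"
                },
                "ifTrueActivities": [
                    {
                        "name": "Delete Old File",
                        "type": "Delete",
                        "typeProperties": {
                            "dataset": { "referenceName": "DestinationFileDataset", "type": "DatasetReference" },
                            "enableLogging": false
                        }
                    }
                ]
            }
        }

    Here ticks() converts both timestamps to numeric values so the comparison does not depend on string formats; changing the -90 adjusts the retention window.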

    So, this pipeline has to be triggered from a parent pipeline after checking the folder and file names, since the condition should ultimately be evaluated only on files. This design can be as below:

    25746-parentpipelineadf.gif
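
    A minimal sketch of the looping part of such a parent pipeline (activity, pipeline and parameter names here are illustrative, not from the attached JSON) could pass each file name on to the delete pipeline; a "Folder" branch (ifFalseActivities) would call a recursive pipeline instead:

        {
            "name": "ForEach Child Item",
            "type": "ForEach",
            "typeProperties": {
                "items": {
                    "value": "@activity('Get Folder Metadata').output.childItems",
                    "type": "Expression"
                },
                "activities": [
                    {
                        "name": "If Item Is A File",
                        "type": "IfCondition",
                        "typeProperties": {
                            "expression": {
                                "value": "@equals(item().type, 'File')",
                                "type": "Expression"
                            },
                            "ifTrueActivities": [
                                {
                                    "name": "Execute Delete Pipeline",
                                    "type": "ExecutePipeline",
                                    "typeProperties": {
                                        "pipeline": { "referenceName": "DeleteOldFilesPipeline", "type": "PipelineReference" },
                                        "parameters": { "fileName": "@item().name" },
                                        "waitOnCompletion": true
                                    }
                                }
                            ]
                        }
                    }
                ]
            }
        }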

    Attaching the JSON code for both of the pipelines above for reference. This can also be done in a single pipeline, using multiple ForEach activities, if the hierarchy is fixed. If there are more nested layers and they are dynamic, we need to implement one more pipeline to dynamically build the folder path and pass the file names to the "delete pipeline".
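
    For the delete pipeline to act on the file names it receives, its dataset can be parameterized; a minimal sketch (the dataset, linked service and container names below are assumptions) might look like:

        {
            "name": "DestinationFileDataset",
            "properties": {
                "type": "Binary",
                "linkedServiceName": { "referenceName": "DestinationBlobStorage", "type": "LinkedServiceReference" },
                "parameters": {
                    "folderPath": { "type": "String" },
                    "fileName": { "type": "String" }
                },
                "typeProperties": {
                    "location": {
                        "type": "AzureBlobStorageLocation",
                        "container": "backups",
                        "folderPath": { "value": "@dataset().folderPath", "type": "Expression" },
                        "fileName": { "value": "@dataset().fileName", "type": "Expression" }
                    }
                }
            }
        }

    The Delete activity can then fill @dataset().folderPath and @dataset().fileName with the values passed down from the parent pipeline.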

    Hope this helps! Please let us know if it does not align with the requirement or if you have further queries, and we will be glad to assist.

    Ref: delete-nested-date-folder-getdate-5-date

    25610-deletechildpipelinejson.txt
    25747-recursviefilecheckjson.txt

