Azure Data Factory - Copy files from one folder to another folder.

Sowmya Thota · 2022-11-15T07:01:30.627+00:00

I want to copy files from one specific path to another specific path in a data lake using ADF pipelines.
Ex: source: demo/Azure/Data
Sink: demo/Azure/Destination

The Data folder contains 2 or 3 folders, and my files are inside those folders.
--> I don't want to copy the folders themselves from Data; I just want to copy the files inside them to the sink.


1 answer

  AnnuKumari-MSFT (Microsoft Employee) · 2022-11-15T16:50:39.297+00:00

    Hi @Sowmya Thota ,

    Welcome to the Microsoft Q&A platform, and thanks for posting your question here.

    As per my understanding, you want to copy files from specific folders of ADLS to another folder within ADLS using ADF pipelines. Please let me know if my understanding is incorrect.

    You can try one of the approaches below:

    Approach 1:
    Drag a Copy activity into the ADF pipeline and, in the source settings, select 'Wildcard file path' as the 'File path type'. If the folders you want to copy data from share a common prefix such as 'adls_folder1', 'adls_folder2', etc., then provide the wildcard folder path as adls*/ . To copy only csv files, set the wildcard file name to *.csv ; to copy all files, use * .

    [Screenshot: Copy activity source settings showing the 'Wildcard file path' option with the wildcard folder path and wildcard file name fields]
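    For reference, here is a minimal sketch of what the resulting Copy activity JSON might look like, assuming binary source and sink datasets against ADLS Gen2. The dataset names, the 'Azure/Data' folder layout, and the 'adls' prefix are assumptions taken from the examples above, not values from your environment:

        {
            "name": "CopyFilesFromDataSubfolders",
            "type": "Copy",
            "description": "Sketch only: dataset names and paths below are hypothetical.",
            "typeProperties": {
                "source": {
                    "type": "BinarySource",
                    "storeSettings": {
                        "type": "AzureBlobFSReadSettings",
                        "recursive": true,
                        "wildcardFolderPath": "Azure/Data/adls*",
                        "wildcardFileName": "*"
                    }
                },
                "sink": {
                    "type": "BinarySink",
                    "storeSettings": {
                        "type": "AzureBlobFSWriteSettings",
                        "copyBehavior": "FlattenHierarchy"
                    }
                }
            },
            "inputs": [ { "referenceName": "SourceBinaryDataset", "type": "DatasetReference" } ],
            "outputs": [ { "referenceName": "SinkBinaryDataset", "type": "DatasetReference" } ]
        }

    Note that copyBehavior 'FlattenHierarchy' writes all matched files into the first level of the sink folder with autogenerated file names; use 'PreserveHierarchy' instead if you want to keep the source subfolder structure and original file names.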

    For more details, kindly visit: Folder and file filter examples

    Approach 2:

    1. Use a Get Metadata activity whose dataset has a blank file path, and select 'Child items' in the field list. This fetches the list of all folders in ADLS.
    2. Use a Filter activity to keep only the folders you want to extract data from, with a condition expression such as @contains(item().name, 'adls').
    3. Use a ForEach activity to iterate over the filtered folders. Inside the ForEach, use an Execute Pipeline activity that calls a child pipeline.
    4. In the child pipeline, use a Get Metadata activity with a parameterized dataset that takes the folder name from a pipeline parameter passed in by the parent pipeline. Select 'Child items' in the field list; this fetches the list of files in each folder.
    5. Use another ForEach activity to iterate over those files, and add a Copy activity inside it with parameterized source and sink datasets. A sketch of the parent pipeline wiring follows this list, and a sketch of the parameterized dataset appears after the video link below.
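
    As a rough sketch of steps 1-3 above, the parent pipeline could be wired like this. All activity, pipeline, dataset, and parameter names here are hypothetical:

        {
            "name": "ParentCopyPipeline",
            "properties": {
                "activities": [
                    {
                        "name": "GetFolderList",
                        "type": "GetMetadata",
                        "description": "Sketch only: all names here are hypothetical.",
                        "typeProperties": {
                            "dataset": { "referenceName": "RootFolderDataset", "type": "DatasetReference" },
                            "fieldList": [ "childItems" ]
                        }
                    },
                    {
                        "name": "FilterAdlsFolders",
                        "type": "Filter",
                        "dependsOn": [ { "activity": "GetFolderList", "dependencyConditions": [ "Succeeded" ] } ],
                        "typeProperties": {
                            "items": { "value": "@activity('GetFolderList').output.childItems", "type": "Expression" },
                            "condition": { "value": "@contains(item().name, 'adls')", "type": "Expression" }
                        }
                    },
                    {
                        "name": "ForEachFolder",
                        "type": "ForEach",
                        "dependsOn": [ { "activity": "FilterAdlsFolders", "dependencyConditions": [ "Succeeded" ] } ],
                        "typeProperties": {
                            "items": { "value": "@activity('FilterAdlsFolders').output.Value", "type": "Expression" },
                            "activities": [
                                {
                                    "name": "RunChildCopy",
                                    "type": "ExecutePipeline",
                                    "typeProperties": {
                                        "pipeline": { "referenceName": "ChildCopyPipeline", "type": "PipelineReference" },
                                        "parameters": {
                                            "folderName": { "value": "@item().name", "type": "Expression" }
                                        }
                                    }
                                }
                            ]
                        }
                    }
                ]
            }
        }

    The child pipeline ('ChildCopyPipeline' above) would declare a folderName string parameter and repeat the Get Metadata / ForEach / Copy pattern from steps 4 and 5 against the parameterized dataset sketched below.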

    For more details, kindly check out: https://www.youtube.com/watch?v=36UrhwoOKUk&list=PLsJW07-_K61KkcLWfb7D2sM3QrM8BiiYB&index=13
    You can follow the same steps explained in the video, except for the If Condition activity; you can use the Copy activity directly instead.
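
    For step 4 above, the parameterized dataset in the child pipeline could look like the following ADLS Gen2 binary dataset. The linked service name, file system, and folder layout are assumptions based on the example paths in the question:

        {
            "name": "ChildFolderDataset",
            "properties": {
                "description": "Sketch only: names and paths below are hypothetical.",
                "type": "Binary",
                "linkedServiceName": { "referenceName": "AdlsGen2LinkedService", "type": "LinkedServiceReference" },
                "parameters": { "folderName": { "type": "string" } },
                "typeProperties": {
                    "location": {
                        "type": "AzureBlobFSLocation",
                        "fileSystem": "demo",
                        "folderPath": {
                            "value": "@concat('Azure/Data/', dataset().folderName)",
                            "type": "Expression"
                        }
                    }
                }
            }
        }

    The child pipeline's Get Metadata and Copy activities would reference this dataset and feed its folderName parameter from @pipeline().parameters.folderName.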

    Hope this helps. Please let us know if you have any further queries.

