How to read the latest files from subfolders in Azure Data Factory

Galih 111 Reputation points

I have an S3 bucket with many country folders, and each country folder contains zipped CSV files.

country_a / report / ( 2022-Oct-01.csv )
country_a / report / ( 2022-Oct-02.csv )
country_b / report / ( 2022-Oct-01.csv )
country_b / report / ( 2022-Oct-02.csv )
country_c / report / ( 2022-Oct-01.csv )
country_d / report / ( 2022-Oct-01.csv )

Using ADF, I need to read the latest CSV file in each country folder:

country_a -> 2022-Oct-02.csv
country_b -> 2022-Oct-02.csv
country_c -> 2022-Oct-01.csv
country_d -> 2022-Oct-01.csv

Is it possible to use only one pipeline to find the latest CSV file in each country folder and copy its data to Azure SQL?
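The selection logic being asked for can be sketched in plain Python. This is only an illustration of the "latest file per country folder" rule, using the hypothetical key listing from the question; it is not ADF code. Note that names like `2022-Oct-01` use month abbreviations, which do not sort correctly as strings (e.g. "Feb" < "Jan" lexically), so the date must be parsed.

```python
from datetime import datetime

# Hypothetical flat listing of the S3 keys from the question.
keys = [
    "country_a/report/2022-Oct-01.csv",
    "country_a/report/2022-Oct-02.csv",
    "country_b/report/2022-Oct-01.csv",
    "country_b/report/2022-Oct-02.csv",
    "country_c/report/2022-Oct-01.csv",
    "country_d/report/2022-Oct-01.csv",
]

def file_date(key: str) -> datetime:
    # Parse the date from a name like 2022-Oct-01.csv; month abbreviations
    # are not lexicographically sortable, so parse instead of comparing text.
    stem = key.rsplit("/", 1)[-1].removesuffix(".csv")
    return datetime.strptime(stem, "%Y-%b-%d")

latest = {}
for key in keys:
    country = key.split("/", 1)[0]
    if country not in latest or file_date(key) > file_date(latest[country]):
        latest[country] = key  # keep the newest file seen for this country
```

Running this yields `country_a -> 2022-Oct-02.csv`, `country_c -> 2022-Oct-01.csv`, and so on, matching the expected output above.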

Thank you.

Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

Accepted answer
  1. ShaikMaheer-MSFT 38,301 Reputation points Microsoft Employee

    Hi @Galih ,

    Thank you for posting query in Microsoft Q&A Platform.

    Kindly check the video below, where I explain how to get the latest file from a folder and process it in Azure Data Factory. In the video I used an Azure storage account; in your case it is an S3 bucket, so only the connector used while creating the dataset will change. The logic of the implementation will be the same. Kindly check it and implement the same approach.
    Get Latest File from Folder and Process it in Azure Data Factory
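    The approach from the video boils down to listing a folder's files with a Get Metadata activity, then reducing that list to the most recently modified file before running the Copy activity. A minimal Python sketch of that reduction step, assuming a hypothetical listing where each file already carries a `lastModified` timestamp (in ADF you would typically fetch `lastModified` with a per-file Get Metadata call inside a ForEach):

```python
# Hypothetical folder listing: file names plus lastModified timestamps,
# as a ForEach over Get Metadata results might accumulate them.
child_items = [
    {"name": "2022-Oct-01.csv", "lastModified": "2022-10-01T03:15:00Z"},
    {"name": "2022-Oct-02.csv", "lastModified": "2022-10-02T03:15:00Z"},
]

# ISO-8601 UTC timestamps compare correctly as plain strings,
# so max() on lastModified picks the newest file.
latest = max(child_items, key=lambda item: item["lastModified"])
```

    In the pipeline, the chosen file name is then passed as a dataset parameter to the Copy activity that writes to Azure SQL; a ForEach over the country folders repeats this per folder, so a single pipeline suffices.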

    Hope this helps. Please let me know if any further queries.


    • Please don't forget to click Accept Answer or upvote whenever the information provided helps you. Original posters help the community find answers faster by identifying the correct answer. Here is how
    • Want a reminder to come back and check responses? Here is how to subscribe to a notification
    1 person found this answer helpful.

0 additional answers
