How to read/write files in a folder inside a container using a Python script stored in blob storage, triggered from Data Factory?

Samyak 41 Reputation points
2022-09-07T08:36:43.423+00:00

I have the following hierarchy:

container -> input folder -> input files
container -> output folder

I know how to read a file that sits directly in the container (with no folder), but what Python script can I use to read the files inside a folder in a container and write them to another folder?

Tags: Azure Blob Storage, Azure Batch, Azure Data Factory

1 answer

  1. ShaikMaheer-MSFT 37,896 Reputation points Microsoft Employee
    2022-09-08T09:08:22.773+00:00

    Hi @Samyak ,

    Thanks for posting your query on the Microsoft Q&A platform.

    You can use a Custom activity to do this; it lets Data Factory run your Python script on an Azure Batch pool. See the Custom activity documentation for details:
    https://learn.microsoft.com/en-us/azure/data-factory/transform-data-using-custom-activity

    The following video also walks through the approach:
    Azure Data Factory - Execute Python script from ADF
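
    To make this concrete, here is a minimal sketch of what that Python script could look like, assuming the azure-storage-blob (v12) package, a connection string supplied through an environment variable, and the container/folder names from your question (all names here are placeholders to adapt):

    ```python
    import os

    from azure.storage.blob import ContainerClient

    # Connect to the container; the connection string comes from an
    # environment variable here (a placeholder name - adjust as needed).
    container = ContainerClient.from_connection_string(
        os.environ["AZURE_STORAGE_CONNECTION_STRING"],
        container_name="container",
    )

    # Blob storage "folders" are just name prefixes, so listing with
    # name_starts_with="input/" returns every blob in the input folder.
    for blob in container.list_blobs(name_starts_with="input/"):
        data = container.download_blob(blob.name).readall()

        # ... process the file contents here if needed ...

        # Write the result under the output folder, keeping the file name.
        out_name = "output/" + blob.name.split("/", 1)[1]
        container.upload_blob(name=out_name, data=data, overwrite=True)
    ```

    The key point is that blob storage has no real folder objects: you read and write "folders" simply by including the prefix in the blob name.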

    Alternatively, you can use the Copy activity directly in Azure Data Factory to perform this task; no Python code is actually needed. Create datasets for the source and sink locations and use them in a Copy activity. Please check the Copy activity documentation below to get a better idea.
    https://learn.microsoft.com/en-us/azure/data-factory/copy-activity-overview
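
    For illustration, here is a rough sketch of what the Copy activity JSON could look like with binary datasets (the dataset names and exact store settings are hypothetical and depend on how you define your datasets):

    ```json
    {
        "name": "CopyInputToOutput",
        "type": "Copy",
        "inputs": [ { "referenceName": "InputFolderDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "OutputFolderDataset", "type": "DatasetReference" } ],
        "typeProperties": {
            "source": {
                "type": "BinarySource",
                "storeSettings": {
                    "type": "AzureBlobStorageReadSettings",
                    "recursive": true,
                    "wildcardFolderPath": "input",
                    "wildcardFileName": "*"
                }
            },
            "sink": {
                "type": "BinarySink",
                "storeSettings": { "type": "AzureBlobStorageWriteSettings" }
            }
        }
    }
    ```

    The source dataset points at the input folder and the sink dataset at the output folder; the Copy activity then moves the files without any code.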

    Hope this helps. Please let me know if you have any further queries.

    -------
    Please consider hitting Accept Answer and Up-Vote. Accepted answers help the community as well.