question

Samyak-3746 asked ShaikMaheer-MSFT commented

How to read/write files in a folder inside a container using a Python script stored in Blob Storage, triggered from Data Factory?

I have the following hierarchy:

container -> input folder -> input files
container -> output folder

I know how to read a file stored directly in the container (with no folder), but what Python script can I use to access files within a folder in a container and write them to another folder?

Tags: azure-data-factory, azure-blob-storage, azure-batch


1 Answer

ShaikMaheer-MSFT answered ShaikMaheer-MSFT commented

Hi @Samyak-3746 ,

Thanks for posting query in Microsoft Q&A Platform.

You can use a Custom activity in Azure Data Factory to run your Python script. See the Custom activity documentation for details.

The video below also gives a good walkthrough:
Azure Data Factory - Execute Python script from ADF
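For the script itself, a minimal sketch of what the Custom activity would run is shown below. It assumes the azure-storage-blob v12 SDK, that the connection string is supplied via an `AZURE_STORAGE_CONNECTION_STRING` environment variable, and uses placeholder names (`container`, `input/`, `output/`) matching the hierarchy in the question; adjust these to your setup.

```python
def to_output_path(blob_name: str,
                   src_prefix: str = "input/",
                   dst_prefix: str = "output/") -> str:
    """Map a source blob name to its destination, e.g. 'input/sales.csv' -> 'output/sales.csv'."""
    if not blob_name.startswith(src_prefix):
        raise ValueError(f"unexpected blob name: {blob_name}")
    return dst_prefix + blob_name[len(src_prefix):]


def copy_input_to_output(container_name: str = "container") -> None:
    """Copy every blob under input/ to output/ in the same container."""
    import os
    from azure.storage.blob import BlobServiceClient  # pip install azure-storage-blob

    service = BlobServiceClient.from_connection_string(
        os.environ["AZURE_STORAGE_CONNECTION_STRING"]
    )
    container = service.get_container_client(container_name)

    # Blob storage has no real folders: listing with a name prefix
    # returns exactly the blobs "inside" the input/ folder.
    for blob in container.list_blobs(name_starts_with="input/"):
        data = container.download_blob(blob.name).readall()
        container.upload_blob(name=to_output_path(blob.name),
                              data=data, overwrite=True)
```

Calling `copy_input_to_output()` from the script's entry point performs the copy; the Custom activity only needs to invoke the script on the Batch pool.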

Alternatively, you can use the Copy activity directly in Azure Data Factory to perform this task; no Python code is actually needed. Create datasets for the source and sink locations and use them in a Copy activity. The Copy activity documentation below gives a better idea:
https://docs.microsoft.com/en-us/azure/data-factory/copy-activity-overview
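For illustration, a source dataset pointing at the input folder might look like the sketch below (an ADF v2 DelimitedText dataset; the dataset and linked-service names here are placeholders, and the sink dataset would be identical with `"folderPath": "output"`):

```json
{
  "name": "InputDataset",
  "properties": {
    "type": "DelimitedText",
    "linkedServiceName": {
      "referenceName": "AzureBlobStorageLS",
      "type": "LinkedServiceReference"
    },
    "typeProperties": {
      "location": {
        "type": "AzureBlobStorageLocation",
        "container": "container",
        "folderPath": "input"
      }
    }
  }
}
```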

Hope this helps. Please let me know if you have any further queries.


Please consider hitting Accept Answer and Up-Vote. Accepted answers help the community as well.


Hi @Samyak-3746 , just checking in to see if the above answer helped. If this answers your query, do click Accept Answer and up-vote. And if you have any further query, do let us know.
