How can I create a pipeline in Data Factory with dynamic storage account names?

asked 2023-01-11T18:37:41.57+00:00
Oscar Ojeda (US) 20 Reputation points

I need to create a pipeline in Data Factory to copy files from one storage account to another. The source and destination storage accounts will be dynamic: one run might copy from A to B, another from C to D. So the file paths and storage accounts need to be set as parameters. Can this be done?

If yes, how can I do this?

If no, is there some other service in Azure where I can accomplish this task?

Azure Data Lake Storage
Azure Storage Accounts
Azure Data Factory

Accepted answer
  1. answered 2023-01-11T19:14:55.1+00:00
    Nandan Hegde 21,606 Reputation points

    Hey, you can use parameterized linked services to achieve this. Use a Lookup activity to pass the necessary parameters at runtime to the datasets, which in turn pass them to the parameterized linked services referenced by the Copy activity.
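    For reference, here is a minimal sketch of what a parameterized linked service and dataset could look like (the names AzureBlobStorage_Param, BinaryDataset_Param and accountName are illustrative, and it assumes the data factory's managed identity has access to the storage accounts):

        {
            "name": "AzureBlobStorage_Param",
            "properties": {
                "type": "AzureBlobStorage",
                "parameters": {
                    "accountName": { "type": "string" }
                },
                "typeProperties": {
                    "serviceEndpoint": "https://@{linkedService().accountName}.blob.core.windows.net"
                }
            }
        }

        {
            "name": "BinaryDataset_Param",
            "properties": {
                "type": "Binary",
                "linkedServiceName": {
                    "referenceName": "AzureBlobStorage_Param",
                    "type": "LinkedServiceReference",
                    "parameters": {
                        "accountName": { "value": "@dataset().accountName", "type": "Expression" }
                    }
                },
                "parameters": {
                    "accountName": { "type": "string" },
                    "containerName": { "type": "string" },
                    "folderPath": { "type": "string" }
                },
                "typeProperties": {
                    "location": {
                        "type": "AzureBlobStorageLocation",
                        "container": { "value": "@dataset().containerName", "type": "Expression" },
                        "folderPath": { "value": "@dataset().folderPath", "type": "Expression" }
                    }
                }
            }
        }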

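    In the pipeline, reference that dataset as both the source and the sink of the Copy activity and feed each side different parameter values, for example from pipeline parameters (names like sourceAccount below are illustrative) or from the Lookup output via expressions such as @activity('LookupConfig').output.firstRow.sourceAccount, where LookupConfig is a hypothetical activity name. A sketch of the Copy activity under those assumptions:

        {
            "name": "CopyBetweenAccounts",
            "type": "Copy",
            "inputs": [
                {
                    "referenceName": "BinaryDataset_Param",
                    "type": "DatasetReference",
                    "parameters": {
                        "accountName": "@pipeline().parameters.sourceAccount",
                        "containerName": "@pipeline().parameters.sourceContainer",
                        "folderPath": "@pipeline().parameters.sourceFolder"
                    }
                }
            ],
            "outputs": [
                {
                    "referenceName": "BinaryDataset_Param",
                    "type": "DatasetReference",
                    "parameters": {
                        "accountName": "@pipeline().parameters.sinkAccount",
                        "containerName": "@pipeline().parameters.sinkContainer",
                        "folderPath": "@pipeline().parameters.sinkFolder"
                    }
                }
            ],
            "typeProperties": {
                "source": {
                    "type": "BinarySource",
                    "storeSettings": { "type": "AzureBlobStorageReadSettings", "recursive": true }
                },
                "sink": {
                    "type": "BinarySink",
                    "storeSettings": { "type": "AzureBlobStorageWriteSettings" }
                }
            }
        }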

0 additional answers
