How can I create a pipeline in Data Factory with dynamic storage account names?

Oscar Ojeda (US) 20 Reputation points
2023-01-11T18:37:41.57+00:00

I need to create a pipeline in Data Factory to copy files from one storage account to another. The source and destination storage accounts will be dynamic: one run might copy from A to B, while another copies from C to D. So the file paths and storage accounts need to be set as parameters. Can this be done?

If yes, how can I do this?

If no, is there some other service in Azure where I can accomplish this task?

Azure Data Lake Storage
Azure Storage Accounts
Azure Data Factory

Accepted answer
  1. Nandan Hegde 29,896 Reputation points MVP
    2023-01-11T19:14:55.1+00:00

    Hey, you can use parameterized linked services to achieve this. Use a Lookup activity to pass the necessary parameters at runtime to the datasets, which in turn pass them to the linked services used by the Copy activity.
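
    As a rough sketch of what that can look like (the names LS_Blob_Dynamic and DS_Blob_Dynamic and the accountName/containerName/folderPath parameters are purely illustrative, and the serviceEndpoint form assumes the factory's managed identity has been granted access to the storage accounts), a parameterized Azure Blob Storage linked service and a Binary dataset that forwards its own parameters to it would look roughly like this:

    Linked service (the account name is a linked-service parameter):

    {
      "name": "LS_Blob_Dynamic",
      "properties": {
        "type": "AzureBlobStorage",
        "parameters": {
          "accountName": { "type": "String" }
        },
        "typeProperties": {
          "serviceEndpoint": "https://@{linkedService().accountName}.blob.core.windows.net"
        }
      }
    }

    Dataset (its own parameters feed the linked-service parameter and the blob location):

    {
      "name": "DS_Blob_Dynamic",
      "properties": {
        "type": "Binary",
        "linkedServiceName": {
          "referenceName": "LS_Blob_Dynamic",
          "type": "LinkedServiceReference",
          "parameters": {
            "accountName": { "value": "@dataset().accountName", "type": "Expression" }
          }
        },
        "parameters": {
          "accountName": { "type": "String" },
          "containerName": { "type": "String" },
          "folderPath": { "type": "String" }
        },
        "typeProperties": {
          "location": {
            "type": "AzureBlobStorageLocation",
            "container": { "value": "@dataset().containerName", "type": "Expression" },
            "folderPath": { "value": "@dataset().folderPath", "type": "Expression" }
          }
        }
      }
    }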

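    A pipeline can then expose the source and sink details as pipeline parameters and pass them to the same dataset on both sides of the Copy activity. Again only a sketch under the same assumptions (made-up names, a Binary copy between blob containers); in practice the values can be supplied when the pipeline is triggered or, as described above, come from a Lookup activity over a control file or table:

    {
      "name": "PL_Copy_Dynamic",
      "properties": {
        "parameters": {
          "sourceAccount": { "type": "String" },
          "sourceContainer": { "type": "String" },
          "sourcePath": { "type": "String" },
          "sinkAccount": { "type": "String" },
          "sinkContainer": { "type": "String" },
          "sinkPath": { "type": "String" }
        },
        "activities": [
          {
            "name": "CopyFiles",
            "type": "Copy",
            "inputs": [
              {
                "referenceName": "DS_Blob_Dynamic",
                "type": "DatasetReference",
                "parameters": {
                  "accountName": "@pipeline().parameters.sourceAccount",
                  "containerName": "@pipeline().parameters.sourceContainer",
                  "folderPath": "@pipeline().parameters.sourcePath"
                }
              }
            ],
            "outputs": [
              {
                "referenceName": "DS_Blob_Dynamic",
                "type": "DatasetReference",
                "parameters": {
                  "accountName": "@pipeline().parameters.sinkAccount",
                  "containerName": "@pipeline().parameters.sinkContainer",
                  "folderPath": "@pipeline().parameters.sinkPath"
                }
              }
            ],
            "typeProperties": {
              "source": {
                "type": "BinarySource",
                "storeSettings": { "type": "AzureBlobStorageReadSettings", "recursive": true }
              },
              "sink": {
                "type": "BinarySink",
                "storeSettings": { "type": "AzureBlobStorageWriteSettings" }
              }
            }
          }
        ]
      }
    }

    Triggering the pipeline with different parameter values then copies from A to B one run and from C to D the next.
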
