Azure data factory: pipeline parameter in sink

Müller, André 71 Reputation points
2021-12-01T08:11:31.913+00:00

I'm using a parameter within a pipeline and looping through another pipeline to load data from source (JSON) to target (database). I want to save the parameter as a field in the target dataset. Using a data flow would solve this problem, but using a Copy activity would be preferred. How can I use a pipeline parameter in the Copy activity mapping?

Thanks in advance.

Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

Accepted answer
  1. MartinJaffer-MSFT 26,096 Reputation points
    2021-12-01T20:51:19.097+00:00

    Hello @Müller, André and welcome to Microsoft Q&A. If I understand your ask correctly, it is possible to write a parameter into a sink column. However, you need to go to the Copy activity's source options first; that is where the feature is.

    The feature you are looking for is called "Additional columns". Click + to add a new one. After naming your new (source) column, select "Dynamic content" in the middle drop-down menu. You will then be prompted to enter the reference to the parameter. See the image below.
    154234-additional-colums.jpg
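    For reference, the step above corresponds to the `additionalColumns` property on the Copy activity's source in the pipeline JSON. This is a minimal sketch, assuming a hypothetical pipeline parameter named `loadDate` and a JSON source; the column name `LoadDate` is just an illustration:

    ```json
    {
        "name": "CopyJsonToDatabase",
        "type": "Copy",
        "typeProperties": {
            "source": {
                "type": "JsonSource",
                "additionalColumns": [
                    {
                        "name": "LoadDate",
                        "value": {
                            "value": "@pipeline().parameters.loadDate",
                            "type": "Expression"
                        }
                    }
                ]
            },
            "sink": {
                "type": "AzureSqlSink"
            }
        }
    }
    ```

    The expression is evaluated once per activity run, so every row written by that run carries the same parameter value.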

    Once this is done, go to the Copy activity mapping and re-run "Import schemas" to make the new column show up on the source (left) side, probably at the bottom of the list. Then you can set what it should map to in the sink (right).

    It is possible to add the new mapping without re-importing schemas, but it is easy to make a typo when adding manually.
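    If you do add the mapping by hand, it ends up in the Copy activity's `translator` section. A minimal sketch, again assuming the hypothetical additional column `LoadDate` from above maps to a sink column of the same name:

    ```json
    {
        "translator": {
            "type": "TabularTranslator",
            "mappings": [
                {
                    "source": { "name": "LoadDate" },
                    "sink": { "name": "LoadDate" }
                }
            ]
        }
    }
    ```

    The source-side name here must exactly match the name you gave the additional column, which is why importing schemas is the safer route.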


0 additional answers

