Can we get a selector to choose trigger values which can be passed to a pipeline when triggered?

PeterSh 176 Reputation points
2021-10-18T03:27:19.277+00:00

I have a blob creation trigger set and I wanted to be able to move the triggering file once it was processed. After a bit of hunting around, I found that we can pass the name of the file which triggered the storage event to the pipeline.

  1. Create a parameter on the pipeline
  2. Attach a storage trigger to the pipeline
  3. When prompted for 'Trigger run parameters', set the pipeline parameter created in step 1 to '@trigger().outputs.body.fileName'
  4. For the pipeline activities that should work with the triggering file, set the file name of the dataset to the parameter from step 1

All well and good. So long as you set the container and directory appropriately, you can now work specifically with the triggering file.
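
For anyone who wants to set the same thing up programmatically, here is a minimal sketch of steps 1 to 3 using the azure-mgmt-datafactory Python SDK. The subscription, resource group, factory, storage account, pipeline name and the triggerFileName/triggerFolderPath parameter names are all placeholders, and exact model constructor arguments can vary slightly between SDK versions, so treat it as an outline rather than a drop-in script.

```python
# Sketch only: create a blob event trigger that passes the triggering file's
# name and folder path into pipeline parameters. All names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    BlobEventsTrigger,
    PipelineReference,
    TriggerPipelineReference,
    TriggerResource,
)

subscription_id = "<subscription-id>"
resource_group = "<resource-group>"
factory_name = "<factory-name>"

adf = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

trigger_properties = BlobEventsTrigger(
    # Storage account that raises the Microsoft.Storage.BlobCreated events.
    scope=(
        f"/subscriptions/{subscription_id}/resourceGroups/{resource_group}"
        "/providers/Microsoft.Storage/storageAccounts/<storage-account>"
    ),
    events=["Microsoft.Storage.BlobCreated"],
    blob_path_begins_with="/input-container/blobs/",
    pipelines=[
        TriggerPipelineReference(
            pipeline_reference=PipelineReference(
                reference_name="ProcessIncomingFile", type="PipelineReference"
            ),
            # Step 3: map the trigger outputs onto the pipeline parameters
            # created in step 1.
            parameters={
                "triggerFileName": "@trigger().outputs.body.fileName",
                "triggerFolderPath": "@trigger().outputs.body.folderPath",
            },
        )
    ],
)

adf.triggers.create_or_update(
    resource_group, factory_name, "BlobCreatedTrigger",
    TriggerResource(properties=trigger_properties),
)
```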

My question is, would it be possible to get a dropdown list of the available variables here?

(Screenshot: 141175-image.png, the trigger run parameters input box)

It would really help make this obvious, and who knows what else might be exposed which could be useful for our pipelines.

That is a little more long term though, so in the interim, is there a resource somewhere that shows exactly what is available to use here?

Thanks,

Azure Data Factory

1 answer

  1. KranthiPakala-MSFT 46,422 Reputation points Microsoft Employee
    2021-10-19T06:19:38.617+00:00

    Hi @PeterSh ,

    Welcome to Microsoft Q&A forum and thanks for posting your query.

    Here is a public document that lists the system variables supported by Azure Data Factory and Azure Synapse. You can use these variables in expressions when defining entities within either service.

    Here is the reference document: System variables supported by Azure Data Factory and Azure Synapse Analytics

    (Screenshots from the linked document: 141549-image.png, 141632-image.png)

    Please note that for the Storage event trigger scope, if you are creating your pipeline and trigger in Azure Synapse Analytics, you must use @trigger().outputs.body.fileName and @trigger().outputs.body.folderPath as parameters. Those two properties capture the triggering blob's information; use them instead of @triggerBody().fileName and @triggerBody().folderPath.
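
    To round out step 4 from the question, here is a hedged sketch, again with the azure-mgmt-datafactory Python SDK, of a dataset that takes its file name from a parameter and a pipeline whose copy activity forwards its own triggerFileName parameter into that dataset. All resource, dataset and parameter names are placeholders, the ArchiveBlobDataset sink is assumed to already exist, and constructor details may differ across SDK versions.

    ```python
    # Sketch of step 4: a parameterized blob dataset plus a pipeline that
    # forwards its triggerFileName parameter into it. Names are placeholders.
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient
    from azure.mgmt.datafactory.models import (
        AzureBlobDataset,
        BlobSink,
        BlobSource,
        CopyActivity,
        DatasetReference,
        DatasetResource,
        LinkedServiceReference,
        ParameterSpecification,
        PipelineResource,
    )

    subscription_id = "<subscription-id>"
    resource_group = "<resource-group>"
    factory_name = "<factory-name>"

    adf = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

    # Dataset whose file name is supplied by a dataset-level parameter.
    source_dataset = DatasetResource(
        properties=AzureBlobDataset(
            linked_service_name=LinkedServiceReference(
                reference_name="BlobStorageLinkedService",
                type="LinkedServiceReference",
            ),
            parameters={"fileName": ParameterSpecification(type="String")},
            folder_path="input-container",
            file_name={"value": "@dataset().fileName", "type": "Expression"},
        )
    )
    adf.datasets.create_or_update(
        resource_group, factory_name, "SourceBlobDataset", source_dataset
    )

    # Copy activity that reads exactly the triggering file by forwarding the
    # pipeline parameter into the dataset parameter.
    copy_activity = CopyActivity(
        name="CopyTriggeringFile",
        source=BlobSource(),
        sink=BlobSink(),
        inputs=[
            DatasetReference(
                reference_name="SourceBlobDataset",
                type="DatasetReference",
                parameters={"fileName": "@pipeline().parameters.triggerFileName"},
            )
        ],
        outputs=[
            # Assumed to exist already; where the processed file is written.
            DatasetReference(
                reference_name="ArchiveBlobDataset", type="DatasetReference"
            )
        ],
    )

    pipeline = PipelineResource(
        parameters={"triggerFileName": ParameterSpecification(type="String")},
        activities=[copy_activity],
    )
    adf.pipelines.create_or_update(
        resource_group, factory_name, "ProcessIncomingFile", pipeline
    )
    ```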

    If you have feedback regarding a product feature request, we encourage you to log it directly from the ADF UI as shown below:

    (Screenshot: 141633-image.png, the feedback option in the ADF UI)

    Hope this info helps. Do let us know if you have any further queries.
