Container name getting added to the @triggerBody().folderPath

Jaganath Kumar 110 Reputation points
2024-04-04T13:22:51.9166667+00:00

Hi Team,

I have set up a storage event trigger on a pipeline that moves files from blob storage to SFTP. The trigger passes two parameters, @triggerBody().folderPath and @triggerBody().fileName, and is configured with blob path begins with 'Load/Customer' and blob path ends with '.txt'. So whenever a file ending in '.txt' arrives in 'Load/Customer', the pipeline triggers and moves the file to SFTP. However, the container name is also being prepended to the path, so I receive 'datafactorycontainer/Load/Customer/' instead of 'Load/Customer/'.

Can someone please help me on this issue?

Appreciate your help.

Thanks,

Jaganath

Azure Data Factory

Accepted answer
  1. Konstantinos Passadis 19,496 Reputation points MVP
    2024-04-04T13:34:02.69+00:00

    Hello @Jaganath Kumar

    In Azure Data Factory (ADF), when you set up a storage event trigger for a pipeline, the trigger details, including the blob path, are passed to the pipeline through @triggerBody().folderPath and @triggerBody().filename. If the container name 'datafactorycontainer' is being included in the path unexpectedly, it may affect the subsequent actions in your pipeline, especially if you're using the path to reference the file location.

    To handle this situation, you have a few options:

    String Manipulation in the Pipeline: You can manipulate the string in your pipeline to strip the container name. For example, if you only need the 'Load/Customer/filename.txt' part, you can use an expression to extract it. Here's an example expression that might be used in a Set Variable activity (or a Derived Column transformation in a data flow):

    @replace(triggerBody().folderPath, 'datafactorycontainer/', '')


    This expression replaces 'datafactorycontainer/' with an empty string, effectively removing it from the path.
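    If a downstream activity also needs the file name, the cleaned folder path and the trigger's fileName output can be combined into a single relative path. This is a sketch; 'datafactorycontainer' is the container name taken from the question:

        @concat(
            replace(triggerBody().folderPath, 'datafactorycontainer/', ''),
            '/',
            triggerBody().fileName
        )

    A container-agnostic variant is to drop the first path segment regardless of its name, for example @join(skip(split(triggerBody().folderPath, '/'), 1), '/').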

    Adjusting the Trigger: Check the configuration of your trigger. Ensure that the path you are monitoring for events is specified correctly. In most cases, the path should start directly with the folder path inside the container, not including the container name.
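    For reference, in the trigger's JSON definition the begins-with filter is container-qualified. A minimal sketch, with the trigger name assumed and the container and folder names taken from the question:

        {
            "name": "CustomerTxtTrigger",
            "properties": {
                "type": "BlobEventsTrigger",
                "typeProperties": {
                    "blobPathBeginsWith": "/datafactorycontainer/blobs/Load/Customer/",
                    "blobPathEndsWith": ".txt",
                    "events": [ "Microsoft.Storage.BlobCreated" ]
                }
            }
        }

    In the authoring UI the container is selected separately and only the folder path is typed into the begins-with field, so double-check that the container name has not been typed into that field as well.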

    Blob Path in Activities: When you use the path in subsequent activities (like a copy activity), ensure you're constructing the path correctly. If the container name is not needed, exclude it from the path you're using in the activity's settings.

    Debugging and Logging: Add a debugging step in your pipeline (for example, a Set Variable activity) to surface the paths you are receiving and the paths you are constructing. This can help you identify exactly where the container name enters and how the path is handled through the pipeline.
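    As a sketch of such a debugging step (assuming the trigger's folderPath output is mapped to a pipeline parameter named folderPath, and the variable name debugPath is hypothetical), a Set Variable activity could capture both the raw and the cleaned value so they appear in the pipeline run output:

        {
            "name": "LogTriggerPath",
            "type": "SetVariable",
            "typeProperties": {
                "variableName": "debugPath",
                "value": {
                    "value": "@concat('raw: ', pipeline().parameters.folderPath, ' | cleaned: ', replace(pipeline().parameters.folderPath, 'datafactorycontainer/', ''))",
                    "type": "Expression"
                }
            }
        }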

    Remember that in Azure Data Factory, paths are typically relative to the container, so including the container name in the path is not usually necessary unless you're referencing it in a Linked Service or some other configuration where the full path is required.


    I hope this helps!

    The answer or portions of it may have been assisted by AI Source: ChatGPT Subscription

    Kindly mark the answer as Accepted and upvote it if it helped, or post your feedback!

    Regards

