Triggers on Azure Data Factory

Hugo Dantas 26 Reputation points
2021-04-16T14:34:45.847+00:00

Hi,

I have a question about triggers in Azure Data Factory.
My scenario is: I have 4k+ databases (one per client, distributed across many servers and elastic pools), and each database has a log table. I want to copy the data from this table to Blob Storage, creating one blob per database.

I have an Azure Function that gets the database, server, and elastic pool names and inserts them into a storage queue in JSON format. I plan to use that data in the Data Factory pipeline.

But Data Factory does not have a queue trigger, only a storage event trigger, so instead of inserting the data into a queue, I created a JSON blob file to be used as the trigger. However, I'm not finding the right way to make Data Factory read the contents of the file that fired the trigger.

Is there any way to achieve this? That is, to make Data Factory read the contents of the blob and use it as an object parameter to a pipeline?

Trigger:
88550-screenshot-2021-04-16-113157.png

Activity
88651-screenshot-2021-04-16-113408.png

JSON File
{
  "Name": "DatabaseName",
  "Pool": "ElasticPoolName",
  "Server": "ServerName"
}

Thank you.

Azure Blob Storage
An Azure service that stores unstructured data in the cloud as blobs.
Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

Answer accepted by question author
  1. MartinJaffer-MSFT 26,161 Reputation points
    2021-04-16T15:10:52.367+00:00

    Hello @Hugo Dantas and welcome to Microsoft Q&A. Please let me share a few ideas with you.

    For the solution you are currently pursuing, you would use a Lookup activity to read the contents of your trigger blob. The contents do not get put into a parameter, since parameters are bound when the pipeline starts, and the trigger itself cannot read the contents of the blob. To interpret the string the Lookup returns as an object, use the @json(blob_contents) function.
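    For example, suppose the Lookup activity is named LookupTriggerBlob (a hypothetical name) and reads the trigger blob through a delimited-text dataset with no headers, so the whole JSON string lands in the default column Prop_0. A downstream activity could then pull individual fields out of it with an expression like this (a sketch, not tested against your factory):

    ```json
    {
      "name": "SourceDatabase",
      "value": "@{json(activity('LookupTriggerBlob').output.firstRow.Prop_0).Name}"
    }
    ```

    Adjust the activity name and column name to match your own dataset definition.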

    However, there is a way to more directly trigger the pipeline. Instead of using blob event triggers, have you considered using your Azure Function to trigger the pipeline run directly?
    You can use the REST API; the request body can carry the pipeline parameters.
    There is also a .NET method.
    And others.
    The PowerShell cmdlet can even take a file as parameter input.
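    As a concrete sketch of the REST option, your Azure Function could call the createRun endpoint and pass the queue message straight through as pipeline parameters (all resource names below are placeholders):

    ```
    POST https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DataFactory/factories/{factoryName}/pipelines/{pipelineName}/createRun?api-version=2018-06-01
    ```

    with a JSON body carrying the parameter values:

    ```json
    {
      "Name": "DatabaseName",
      "Pool": "ElasticPoolName",
      "Server": "ServerName"
    }
    ```

    This assumes the pipeline declares parameters named Name, Pool, and Server. The response contains a runId you can use to monitor the run.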


0 additional answers
