Need a workaround to make an event trigger work on Data Lake ADLS Gen1

Simhadri Saripalli 1 Reputation point
2020-07-23T10:35:07.917+00:00

I need to find a workaround to trigger my ADF V2 pipeline when files are placed in Data Lake ADLS Gen1. Can anyone suggest an alternative approach that would give a workable solution? Any reference solutions or links are appreciated.

Azure Data Lake Storage
An Azure service that provides an enterprise-wide hyper-scale repository for big data analytic workloads and is integrated with Azure Blob Storage.
Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

3 answers

  1. HarithaMaddi-MSFT 10,151 Reputation points
    2020-07-23T12:25:18.713+00:00

    Hi @SimhadriSaripalli-2399,

    Welcome to Microsoft Q&A Platform.

    A workaround can be to use an Azure Function with a Blob trigger, as shown below. Code can then be added to this function to act on the file once it is read, according to your requirement. Please check the useful links below as well.

    [Animated GIF: 13485-functionblobtrigger.gif, creating an Azure Function with a Blob trigger in the portal]
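    For illustration, a minimal sketch of such a blob-triggered function using the Python v2 programming model is below; the container path "landing/{name}", the connection setting name, and the function name are hypothetical and would need to match your own storage setup.

    ```python
    import logging

    import azure.functions as func

    app = func.FunctionApp()

    # Hypothetical container path and connection setting; adjust to your storage account.
    @app.blob_trigger(arg_name="blob",
                      path="landing/{name}",
                      connection="AzureWebJobsStorage")
    def on_file_arrival(blob: func.InputStream):
        # Runs whenever a new blob lands in the "landing" container.
        logging.info("New file detected: %s (%s bytes)", blob.name, blob.length)
        # Read the blob and add further processing here, e.g. start an ADF pipeline run.
    ```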

    Hope this helps! Do let us know if you have further queries.


    Please consider clicking "Accept Answer" and "Up-vote" on the post that helps you, as it can be beneficial to other community members.


  2. Vaibhav Chaudhari 38,971 Reputation points Volunteer Moderator
    2020-07-23T13:13:47.08+00:00

    One crude workaround, and I am not sure if it will work for you.

    Ask the source team to push a dummy.txt file to Gen2 whenever they push data to Gen1, and create an event trigger pointing at that dummy file in Gen2 (see the sketch below).

    This way, the pipeline is triggered as soon as the dummy.txt file is created or modified.
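    As one possible sketch of this approach (the resource names, the "signals" container, and the pipeline name below are placeholders, and the trigger can equally be created from the ADF UI), the event trigger on the Gen2 dummy file could be defined with the Python management SDK like this:

    ```python
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient
    from azure.mgmt.datafactory.models import (
        BlobEventsTrigger, PipelineReference, TriggerPipelineReference, TriggerResource
    )

    # Placeholder names; substitute your own resources.
    SUBSCRIPTION_ID = "<subscription-id>"
    RESOURCE_GROUP = "<resource-group>"
    FACTORY_NAME = "<data-factory-name>"
    GEN2_ACCOUNT_ID = (
        f"/subscriptions/{SUBSCRIPTION_ID}/resourceGroups/{RESOURCE_GROUP}"
        "/providers/Microsoft.Storage/storageAccounts/<gen2-account>"
    )

    adf = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

    # Fire whenever dummy.txt is created (or overwritten) in the "signals" container.
    trigger = BlobEventsTrigger(
        events=["Microsoft.Storage.BlobCreated"],
        scope=GEN2_ACCOUNT_ID,
        blob_path_begins_with="/signals/blobs/",
        blob_path_ends_with="dummy.txt",
        ignore_empty_blobs=False,  # dummy.txt may well be a 0-byte file
        pipelines=[TriggerPipelineReference(
            pipeline_reference=PipelineReference(reference_name="<pipeline-name>",
                                                 type="PipelineReference"))],
    )

    adf.triggers.create_or_update(RESOURCE_GROUP, FACTORY_NAME, "DummyFileTrigger",
                                  TriggerResource(properties=trigger))
    # Older SDK versions expose this as adf.triggers.start(...).
    adf.triggers.begin_start(RESOURCE_GROUP, FACTORY_NAME, "DummyFileTrigger").result()
    ```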

    ===============================================
    If the response helped, do "Accept Answer" and upvote it -- Vaibhav


  3. Chand, Anupam SBOBNG-ITA/RX 471 Reputation points
    2022-02-11T03:11:00.477+00:00

    This is a bit of an old question, but it still seems like the answer is missing. First, ADLS Gen1 is not Blob storage, so you cannot use a Blob trigger.
    What you can do is set up diagnostic logging as shown HERE and feed the logs into an Event Hub. Then have an Azure Function listening on that Event Hub (link). An Azure Function on the Consumption plan is the right choice because it is cheap (it will be cheaper than a Logic App) and you need to do some filtering: diagnostic logs capture a lot of information, but you only want the events raised when a new file is loaded. When you do get such an event, you can trigger your Data Factory pipeline using the REST API from the Azure Function itself (Link).
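    A minimal sketch of such a function, using the Python v2 programming model and the azure-mgmt-datafactory SDK (which wraps the REST API), is below. The Event Hub name, connection setting, resource names, and the "create" operation filter are assumptions that depend on your diagnostic log schema and setup.

    ```python
    import json
    import logging

    import azure.functions as func
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient

    # Placeholder names; replace with your own subscription, factory and pipeline.
    SUBSCRIPTION_ID = "<subscription-id>"
    RESOURCE_GROUP = "<resource-group>"
    FACTORY_NAME = "<data-factory-name>"
    PIPELINE_NAME = "<pipeline-name>"

    app = func.FunctionApp()

    @app.event_hub_message_trigger(arg_name="event",
                                   event_hub_name="adls-diagnostics",
                                   connection="EventHubConnection")
    def on_diagnostic_event(event: func.EventHubEvent):
        # Diagnostic settings deliver log batches as JSON with a "records" array.
        records = json.loads(event.get_body().decode("utf-8")).get("records", [])

        # Filter: keep only file-creation operations (the exact operationName values
        # depend on the ADLS Gen1 request-log schema).
        new_files = [r for r in records if r.get("operationName", "").lower() == "create"]
        if not new_files:
            return

        # Trigger the Data Factory pipeline for the new file(s).
        adf = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)
        run = adf.pipelines.create_run(RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME)
        logging.info("Started pipeline run %s for %d new file(s)", run.run_id, len(new_files))
    ```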

