Azure Data Factory trigger event not working while uploading using .NET client library

NATASHA TAHARIYA 0 Reputation points
2025-02-25T22:26:52.95+00:00

I want to trigger an ADF pipeline on blob creation (a file-based storage event trigger). A manual upload to blob storage triggers the ADF pipeline correctly, but when I upload the same file from a web app via the Azure Data Lake .NET client library, specifically the package <package id="Azure.Storage.Files.DataLake" version="12.21.0" targetFramework="net472" />

method - await datalakeFileClient.UploadAsync(content: stream, overwrite: true)

The ADF trigger does not fire.

Any idea how to resolve this issue?

Azure Data Factory

1 answer

  1. Chandra Boorla 14,510 Reputation points Microsoft External Staff Moderator
    2025-02-26T02:13:17.5466667+00:00

    Hi @NATASHA TAHARIYA

    Thank you for posting your query!

    As I understand it, the ADF trigger isn't firing when you upload a file using the .NET client library. This happens because DataLakeFileClient.UploadAsync() writes the file through create/append/flush operations and does not close the file by default, which prevents the BlobCreated event from firing.

    Why Is This Happening?

    ADF storage event triggers rely on Event Grid BlobCreated events. On storage accounts with a hierarchical namespace, BlobCreated fires only when a file is properly closed (a flush-with-close operation). Manual uploads (e.g., via the portal or Azure Storage Explorer) work because they close the file after writing. DataLakeFileClient does not close the file automatically, so ADF doesn't detect the new file.

    Here are a couple of approaches that might help you fix the issue:

    Approach 1 - Ensure File is Finalized (FlushAsync)

    Explicitly flush and close the file after uploading:

    // Requires: using Azure.Storage.Files.DataLake;
    // Ensure fileClient is initialized properly
    DataLakeFileClient fileClient = dataLakeFileSystemClient.GetFileClient("your-file-name");

    // Upload the file stream (creates and appends the data, but may not close the file)
    await fileClient.UploadAsync(stream, overwrite: true);

    // Flush and close the file; closing is what raises the BlobCreated event
    long fileSize = stream.Length;
    await fileClient.FlushAsync(position: fileSize, close: true);
    

    Why? This forces the file to be committed and closed, allowing ADF to detect it.
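    The upload and the closing flush can also be collapsed into a single call. A minimal sketch, assuming a recent 12.x version of Azure.Storage.Files.DataLake in which `DataLakeFileUploadOptions` exposes a `Close` property (the `fileClient` and `stream` variables are the ones from the snippet above):

    ```csharp
    using Azure.Storage.Files.DataLake;
    using Azure.Storage.Files.DataLake.Models;

    // Sketch (assumes DataLakeFileUploadOptions.Close is available in your
    // library version): upload and close in one call, so the final flush
    // closes the file and the BlobCreated event is raised.
    var options = new DataLakeFileUploadOptions
    {
        Close = true  // close the file when the upload's final flush completes
    };
    await fileClient.UploadAsync(stream, options);
    ```

    This avoids the separate FlushAsync call, at the cost of depending on a newer library version.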

    Approach 2 - Use BlobClient Instead (Recommended if DataLake Features Aren’t Needed)

    If you don’t need Data Lake-specific features (e.g., ACLs, hierarchical namespace), using BlobClient is simpler:

    // Requires: using Azure.Storage.Blobs;
    // Ensure containerClient is properly initialized
    BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient("your-container-name");

    // Get a BlobClient for the target blob
    BlobClient blobClient = containerClient.GetBlobClient("your-file-name");

    // Upload the file stream; Blob Storage commits the blob in one operation
    await blobClient.UploadAsync(stream, overwrite: true);
    

    Why? The Blob Storage API automatically triggers the BlobCreated event, requiring no extra steps.

    Additional Debugging Steps -

    • Check the Event Grid subscription - In the Azure Portal, go to Storage Account → Events → Event Subscriptions and ensure the subscription is active.
    • Check the ADF trigger settings - If "Ignore empty blobs" is enabled, make sure the uploaded file isn't empty.
    • Check storage logs - If the event isn't firing, review the storage diagnostic logs to confirm whether BlobCreated is raised.

    Conclusion -

    • If using DataLakeFileClient, call FlushAsync(position, close: true) after the upload.
    • If you can switch to BlobClient, that's the easiest fix.

    I hope this information helps. Please do let us know if you have any further queries.

    Kindly consider upvoting the comment if the information provided is helpful. This can assist other community members in resolving similar issues.

    Thank you.

    1 person found this answer helpful.
