How to use Event Grid message data (JSON) as a data source for a Copy activity or data flow

Gandhi, Sanjai 0 Reputation points
2024-07-01T20:52:59.69+00:00

We are working on a proof of concept where we will be publishing enriched data to Event Grid topics. I need a data pipeline in ADF to receive this data and insert it into SQL tables. I'm unable to use the Event Grid data as a data source in a Copy Data or Data Flow activity. Please help with how we can achieve this scenario.

Thanks,

San.

Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.
Azure Event Grid
An Azure event routing service designed for high availability, consistent performance, and dynamic scale.

1 answer

  1. phemanth 8,485 Reputation points Microsoft Vendor
    2024-07-02T04:29:38.3266667+00:00

    @Gandhi, Sanjai

Azure Data Factory (ADF) can't read from an Event Grid topic directly as the source of a Copy activity or data flow. The usual pattern for this scenario is to land the enriched data in a storage account and use a Storage Event Trigger: the trigger responds to events on the storage account (such as a blob arriving in or being deleted from Azure Blob Storage) and starts a pipeline that loads the new file into your SQL tables. Here’s how you can set it up:

    1. Register with Event Grid:
      • Ensure your subscription is registered with the Event Grid resource provider (steps 1 and 2 are scripted in the first Python sketch after this list).
      • If you’re using this feature in Azure Synapse Analytics, also register your subscription with the Data Factory resource provider.
    2. Configure Network Rules:
      • If your blob storage account resides behind a private endpoint and blocks public network access, configure network rules to allow communication from blob storage to Azure Event Grid.
      • You can either grant storage access to trusted Azure services (like Event Grid) or configure private endpoints for Event Grid.
    3. Supported Storage Accounts:
      • The Storage Event Trigger currently supports only Azure Data Lake Storage Gen2 and General-purpose version 2 storage accounts.
      • If you’re working with SFTP Storage Events, specify the SFTP Data API under the filtering section.
      • Due to an Azure Event Grid limitation, Azure Data Factory supports a maximum of 500 storage event triggers per storage account.
      • Ensure that the Azure account used to log into the service and publish the storage event trigger has appropriate role-based access control (Azure RBAC) permissions on the storage account.
    4. Create a Storage Event Trigger:
      • In ADF, create a new Storage Event Trigger and attach it to your pipeline (see the second Python sketch after this list).
      • Configure it to listen for events on your storage account (e.g., new blobs arriving in Blob Storage).
      • When a matching event occurs, the trigger starts your pipeline.
    5. Custom Event Payload (Optional):
      • If you need to parse custom data from the event payload and pass it to your pipeline, create pipeline parameters.
      • Use the format @triggerBody().event.data.keyName to extract values from the event payload, or the built-in @triggerBody().folderPath and @triggerBody().fileName properties that a storage event trigger exposes.
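
    If you'd rather script the setup than use the portal, here's a minimal sketch of steps 1 and 2 using the Azure SDK for Python (the azure-identity, azure-mgmt-resource, and azure-mgmt-storage packages). The subscription ID, resource group, and storage account names are placeholders for your environment:

    ```python
    # Step 1: register the Event Grid resource provider, and
    # Step 2: let trusted Azure services (including Event Grid) through the
    # storage account's network rules. All names below are placeholders.
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.resource import ResourceManagementClient
    from azure.mgmt.storage import StorageManagementClient
    from azure.mgmt.storage.models import NetworkRuleSet, StorageAccountUpdateParameters

    subscription_id = "<subscription-id>"   # placeholder
    resource_group = "my-rg"                # placeholder
    storage_account = "mystorageaccount"    # placeholder
    credential = DefaultAzureCredential()

    # Same effect as `az provider register --namespace Microsoft.EventGrid`.
    resource_client = ResourceManagementClient(credential, subscription_id)
    resource_client.providers.register("Microsoft.EventGrid")

    # Allow trusted Azure services to bypass the network rules so blob storage
    # can still publish events to Event Grid while public access stays blocked.
    storage_client = StorageManagementClient(credential, subscription_id)
    storage_client.storage_accounts.update(
        resource_group,
        storage_account,
        StorageAccountUpdateParameters(
            network_rule_set=NetworkRuleSet(
                bypass="AzureServices",  # trusted services, e.g. Event Grid
                default_action="Deny",   # keep public network access blocked
            )
        ),
    )
    ```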
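
    And here's a minimal sketch of steps 4 and 5 using azure-mgmt-datafactory. The factory, trigger, and pipeline names are placeholders, and the "InsertIntoSql" pipeline is assumed to declare sourceFolder and sourceFile parameters; you can create the same trigger from the ADF authoring UI instead:

    ```python
    # Step 4: create a storage event trigger that fires on new blobs, and
    # Step 5: map event payload values into pipeline parameters.
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient
    from azure.mgmt.datafactory.models import (
        BlobEventsTrigger,
        PipelineReference,
        TriggerPipelineReference,
        TriggerResource,
    )

    subscription_id = "<subscription-id>"   # placeholder
    resource_group = "my-rg"                # placeholder
    factory_name = "my-adf"                 # placeholder
    credential = DefaultAzureCredential()

    adf_client = DataFactoryManagementClient(credential, subscription_id)

    # Fire whenever a new blob lands under the path in blob_path_begins_with.
    trigger = TriggerResource(
        properties=BlobEventsTrigger(
            scope=(
                f"/subscriptions/{subscription_id}/resourceGroups/{resource_group}"
                "/providers/Microsoft.Storage/storageAccounts/mystorageaccount"
            ),
            events=["Microsoft.Storage.BlobCreated"],
            blob_path_begins_with="/input/blobs/",
            ignore_empty_blobs=True,
            pipelines=[
                TriggerPipelineReference(
                    pipeline_reference=PipelineReference(reference_name="InsertIntoSql"),
                    # folderPath and fileName are the built-in outputs of a
                    # storage event trigger; the pipeline must declare
                    # matching parameters.
                    parameters={
                        "sourceFolder": "@triggerBody().folderPath",
                        "sourceFile": "@triggerBody().fileName",
                    },
                )
            ],
        )
    )
    adf_client.triggers.create_or_update(
        resource_group, factory_name, "BlobArrivalTrigger", trigger
    )
    # Triggers are created stopped; start it so it begins firing.
    adf_client.triggers.begin_start(
        resource_group, factory_name, "BlobArrivalTrigger"
    ).result()
    ```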

    Remember that Azure Data Factory natively integrates with Azure Event Grid, making it a powerful solution for event-driven data pipelines. If you encounter any issues during setup, feel free to ask for further assistance!

    For more detailed information, you can refer to the official Azure Data Factory documentation on storage event triggers.

    Hope this helps. Do let us know if you have any further queries.
