Build event-driven Pipelines with OneLake events and Azure Blob Storage events

Today’s organizations demand real-time responsiveness from their analytics platforms. When data processing relies on scheduled job runs, insights and actions are delayed, and decisions are based on stale data. Whether your data lands in Azure Blob Storage or Fabric OneLake, it should be processed the moment it arrives to ensure timely decisions and continuous data freshness. Fabric events and Azure events make that possible by enabling event-driven data workflows that react in real time to new data, without manual triggers or schedules.

In this article, you learn how to configure an event-driven pipeline that is triggered automatically when a new file lands in OneLake or Azure Blob Storage, and then ingests and transforms that file.

Why event-driven workflows? 

Fabric jobs, like pipelines and notebooks, can be scheduled to run at fixed intervals, but data doesn’t always arrive on a predictable schedule. This mismatch can lead to stale data and delayed insights. Fabric events and Azure events solve this problem by emitting events when a file is created, updated, or deleted in OneLake or Azure Blob Storage. These events can be consumed by Activator, which can trigger Fabric items (for example, pipelines or notebooks) or Power Automate workflows.

This event-driven workflow enables:

  • Faster time-to-insight with real-time data processing
  • Reduced cost by eliminating unnecessary job (that is, pipeline or notebook) runs
  • Greater automation and responsiveness in your data workflows
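For reference, OneLake file events arrive in a CloudEvents-style envelope that identifies the event type, the source item, and the affected path. The sketch below, written as a plain Python dictionary, illustrates roughly what a Microsoft.Fabric.OneLake.FileCreated event might look like; the subject path and the contents of the data field are illustrative assumptions, not the authoritative schema.

```python
# Illustrative only: a CloudEvents-style envelope for a OneLake FileCreated event.
# The subject path and the "data" payload are assumptions for this tutorial,
# not the documented schema.
sample_event = {
    "type": "Microsoft.Fabric.OneLake.FileCreated",  # the event type selected later in this tutorial
    "source": "<workspace and lakehouse that emitted the event>",
    "subject": "Files/Source/sales.csv",             # hypothetical path of the new file
    "time": "2025-01-01T12:00:00Z",
    "data": {},                                      # file metadata, omitted here
}
```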

Automatically ingest and process files with an event-driven pipeline

In this tutorial, you develop a solution that performs the following operations:

  1. Monitors a folder in OneLake for new CSV files

  2. Triggers a Fabric pipeline when a file is created

  3. Processes and loads the data into a Lakehouse table, without any manual intervention or a schedule.

    Screenshot of a diagram showing the architecture of the solution.

Create a lakehouse 

First, let’s create a lakehouse where you can upload the CSV files and store the resulting table.

  1. In a web browser, sign in to Microsoft Fabric.

  2. Select My workspace on the left navigation bar.

  3. On the workspace page, select New item.

  4. In the New item pane, select Lakehouse in the Store data section.

    Screenshot of the New item pane with Lakehouse selected.

  5. In the New lakehouse window, enter TutorialLakehouse for the name, and select Create.

  6. Right-click the Files folder, and then select New subfolder.

    Screenshot of the lakehouse page with the New subfolder menu highlighted.

  7. Name the subfolder Source, and select Create.

Build your pipeline 

Next, configure a pipeline to ingest, transform, and deliver the data in your Lakehouse. 

  1. Open another web browser tab, and sign in to Microsoft Fabric using the same account.

  2. Select My workspace on the left navigation bar.

  3. On the workspace page, select New item.

  4. In the New item pane, select Pipeline in the Get data section.

  5. Name it TutorialPipeline and select Create.

  6. In the pipeline editor, select Pipeline activity, and then select Copy data.

    Screenshot of the Pipeline page with the Copy Pipeline activity.

  7. Configure the Copy data activity with these properties:

    1. In the General tab, enter CSVtoTable for Name.

      Screenshot of the General tab for the Copy activity.

    2. In the Source tab, do these steps:

      1. For Connection, select the TutorialLakehouse you created earlier.

      Screenshot of the Source tab for the Copy activity with TutorialLakehouse selected.

      2. For Root folder, select Files.
      3. For File path, select Source for the Directory.
      4. For File format, select DelimitedText.

      Screenshot of the Source tab for the Copy activity with all the fields filled.

    3. In the Destination tab:

      1. For Connection, select TutorialLakehouse.
      2. For Root folder, select Tables.
      3. For Table, select + New, and enter Sales for the table name.

      Screenshot of the Destination tab for the Copy activity with all the fields filled.

    4. In the Mapping tab, add two mappings:

      1. date -> Date
      2. total -> SalesValue

      Screenshot of the Mapping tab for the Copy activity.

  8. Save the pipeline by using the Save button on the toolbar at the top.

    Screenshot of pipeline editor.
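If it helps to see what the Copy activity does in plain code, the following is a minimal pandas sketch of the same transformation: read a delimited file, apply the two column mappings, and land the result as a table. The file names are placeholders, and the pipeline, not this code, does the actual work (it writes a Delta table rather than a Parquet file).

```python
import pandas as pd

# Conceptual equivalent of the CSVtoTable Copy activity.
df = pd.read_csv("sales.csv")                                     # source columns: date, total
df = df.rename(columns={"date": "Date", "total": "SalesValue"})   # the two mappings
df.to_parquet("Sales.parquet", index=False)                       # the pipeline writes the Sales table instead
```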

Set up an alert using Fabric Activator

  1. Open another web browser tab, and sign in to Microsoft Fabric using the same account.

  2. On the left navigation bar, select Real-Time.

  3. In the Real-Time hub, select Fabric events.

  4. Hover over OneLake events and select the Set alert button, or select the ellipsis (...) and then select Set alert, to start configuring your alert.

    Screenshot of the Real-Time hub with Set alert menu option selected for OneLake events.

  5. In the Set alert window, for Source, choose Select events.

    Screenshot of the Set alert window.

  6. In the Configure connection settings window, for Event type, select Microsoft.Fabric.OneLake.FileCreated and Microsoft.Fabric.OneLake.FileDeleted events.

    Screenshot of the Configure connection settings window with the File Created and File Deleted events selected.

  7. In the Select data source for events section, select Add a OneLake source.

    Screenshot of the Configure connection settings window with the Add a OneLake source button highlighted.

  8. In the OneLake catalog window, select TutorialLakehouse, and then select Next.

    Screenshot of the OneLake catalog window with the TutorialLakehouse selected.

  9. On the next page, expand Files, select Source, and then select Next.

    Screenshot of the OneLake catalog window with Source folder selected.

  10. On the Configure connection settings page, select Next.

  11. On the Review + connect page, select Save.

  12. Now, in the Set alert pane, follow these steps:

    1. For Action, select Run a Fabric item.

    2. For Workspace, select the workspace where you created the pipeline.

    3. For Item, select TutorialPipeline.

    4. In the Save location section, select the workspace where you want to create a Fabric activator item with the alert.

    5. For Item, select Create a new item.

    6. For New item name, enter TutorialActivator.

    7. Select Create.

      Screenshot of the Set alert window with Run a fabric item option selected for the action.

This setup ensures your pipeline runs instantly whenever a new file appears in the source folder.

Test the workflow

To test your workflow:

  • Upload this CSV file to the Source folder in your TutorialLakehouse (or upload it programmatically, as shown in the sketch after this list). Close the Upload files pane after you upload the file.

    Screenshot of the OneLake page with the Upload files menu selected.

  • A FileCreated event is emitted, which triggers TutorialPipeline through TutorialActivator.

  • After processing, you’ll see the Sales table with the newly ingested and transformed data, ready for use. Because this is the first time a file landed in the folder, allow a few minutes for the table to show data.

    Screenshot of the Lakehouse with Sales table highlighted.
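If you prefer to drop the file programmatically instead of through the portal, OneLake exposes ADLS Gen2-compatible endpoints, so the Azure Storage SDK can write to the same Source folder. The following is a minimal sketch that assumes the azure-storage-file-datalake and azure-identity packages and a workspace named My workspace; adjust the names to match your environment. Uploading the file emits the same FileCreated event.

```python
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# OneLake is ADLS Gen2-compatible; the workspace acts as the file system.
service = DataLakeServiceClient(
    account_url="https://onelake.dfs.fabric.microsoft.com",
    credential=DefaultAzureCredential(),
)
fs = service.get_file_system_client("My workspace")  # your workspace name

# Hypothetical sample data with the columns the Copy activity maps (date, total).
csv_bytes = b"date,total\n2025-01-01,100\n2025-01-02,250\n"

file = fs.get_file_client("TutorialLakehouse.Lakehouse/Files/Source/sales.csv")
file.upload_data(csv_bytes, overwrite=True)  # raises a FileCreated event that triggers the pipeline
```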

No manual refresh. No waiting for the next scheduled run. Your pipeline runs in real time.

The result is seamless automation. With just a few steps, you built a responsive, event-driven workflow. Every time data lands in your lakehouse as a file, it’s automatically ingested, transformed, and ready for downstream analytics.

While this tutorial focused on OneLake events, you can achieve the same scenario by using Azure Blob Storage events.
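The Azure side works the same way: uploading a blob raises a Microsoft.Storage.BlobCreated event that you can wire to an Activator alert in much the same way. The following is a minimal sketch that assumes the azure-storage-blob package and placeholder account and container names.

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

# Placeholder names: substitute your own storage account and container.
service = BlobServiceClient(
    account_url="https://<storage-account>.blob.core.windows.net",
    credential=DefaultAzureCredential(),
)
container = service.get_container_client("source")

# Uploading the file raises a Microsoft.Storage.BlobCreated event, which can
# trigger the same pipeline through an alert on Azure Blob Storage events.
with open("sales.csv", "rb") as f:
    container.upload_blob(name="sales.csv", data=f, overwrite=True)
```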

More use cases for event-driven scenarios 

Beyond the use case we explored, here are more scenarios where you can use OneLake and Azure Blob Storage events in Microsoft Fabric: 

  • Trigger a Notebook through Activator for advanced data science preprocessing.
  • Forward events to a webhook through Eventstreams for custom compliance and data quality scans.
  • Get alerted when critical datasets are modified through Activator’s Teams and email notifications.
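As an example of the webhook scenario in the preceding list, the receiving endpoint could be as small as the following Flask sketch, which logs each forwarded event before running a custom check. The route name and the fields read from each event are illustrative assumptions, and the exact payload shape depends on how the eventstream destination is configured.

```python
from flask import Flask, request

app = Flask(__name__)

@app.post("/onelake-events")  # illustrative route for the eventstream's webhook destination
def handle_events():
    payload = request.get_json(force=True)
    events = payload if isinstance(payload, list) else [payload]
    for event in events:
        # Run a custom compliance or data quality check on each forwarded event.
        print("Received event:", event.get("type"), event.get("subject"))
    return "", 204

if __name__ == "__main__":
    app.run(port=8080)
```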