Rakesh Kumar 45 Reputation points

Hi, I have 5 tables and need to store the data in a data lake in a table structure like:


I have implemented the incremental load using Data Factory. But if anybody updates existing data, how can we reflect those updates in Azure Blob Storage?


1 answer

  1. Amira Bedhiafi 19,626 Reputation points

    I think it is not possible based on this thread:

    A possible workaround:

    You can handle incremental updates to a Blob Storage file using Azure Databricks, and in ADF you can call the Databricks Notebook activity.

    In this approach, you first mount the Azure storage on the Databricks cluster.

    Then you read the original file and the incremental file (or data) as DataFrames.

    You can join the two DataFrames to build a merged DataFrame and overwrite your original file with it, so the original file now holds the updated data, and in a single file.
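    The join-and-overwrite step above can be sketched as follows. This is a minimal illustration in pandas (in a Databricks notebook you would typically use PySpark DataFrames instead); the sample data, the `id` key column, and the file paths in the comments are assumptions for the example, not part of the original answer:

    ```python
    import pandas as pd

    # Assumed sample data: in Databricks these would come from files on the
    # mounted storage, e.g. pd.read_csv("/mnt/datalake/original.csv").
    original = pd.DataFrame({"id": [1, 2, 3], "value": ["a", "b", "c"]})
    incremental = pd.DataFrame({"id": [2, 4], "value": ["b-updated", "d"]})

    # Upsert: rows from the incremental load replace matching rows (same id)
    # in the original data; new ids are appended.
    merged = (
        pd.concat([original, incremental])
          .drop_duplicates(subset="id", keep="last")
          .sort_values("id")
          .reset_index(drop=True)
    )

    # The merged frame can then be written back over the original file,
    # e.g. merged.to_csv("/mnt/datalake/original.csv", index=False),
    # which gives you the updated data in a single file.
    ```

    The `keep="last"` argument is what makes this an update: when an `id` appears in both frames, the incremental row (concatenated last) wins.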