Azure Data Factory Pipeline to Update an Azure Table Storage Record

Michael Giusto 1 Reputation point
2022-04-06T21:13:06.507+00:00

I have a Logic App that moves files from an FTP site to Blob storage. After each file is moved successfully, I create an entry in an Azure Table in the storage account with some information about the file I placed in Blob storage: FileName, FilePath, TimeStamp, and a Status field that I set to "100".
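
For reference, the entry written for each file is equivalent to something like this (a sketch using the azure-data-tables Python SDK; the table name, PartitionKey scheme, connection string, and file names are placeholders, not my actual values):

```python
from datetime import datetime, timezone

from azure.data.tables import TableClient

# Placeholder connection string and table name; the real values come from the storage account.
table_client = TableClient.from_connection_string(
    conn_str="<storage-connection-string>",
    table_name="FileImports",
)

# One row per file landed in Blob storage; Status "100" means "uploaded, not yet imported".
entity = {
    "PartitionKey": "ftp-files",                   # assumed partitioning scheme
    "RowKey": "SalesExport_20220406.csv",          # hypothetical file name just copied to Blob storage
    "FileName": "SalesExport_20220406.csv",
    "FilePath": "imports/SalesExport_20220406.csv",
    "TimeStamp": datetime.now(timezone.utc).isoformat(),
    "Status": "100",
}
table_client.create_entity(entity=entity)
```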

At the end of the Logic App I create a pipeline run, which imports the file contents from Blob storage into an Azure SQL table.

In this pipeline run I would now like to update the Azure Table and change the Status to "500" for the FileName that was just imported.
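
In other words, I need the equivalent of a merge on that same entity, something like the sketch below (same placeholder table name and keys as above; how to invoke this from the pipeline is the part I'm unsure about):

```python
from azure.data.tables import TableClient, UpdateMode

table_client = TableClient.from_connection_string(
    conn_str="<storage-connection-string>",
    table_name="FileImports",
)

# MERGE updates only the properties supplied and leaves the rest of the entity untouched.
table_client.update_entity(
    entity={
        "PartitionKey": "ftp-files",           # must match the keys written at upload time
        "RowKey": "SalesExport_20220406.csv",
        "Status": "500",                       # "500" = imported into Azure SQL
    },
    mode=UpdateMode.MERGE,
)
```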

I know it is possible to just add another step in my Logic App to change the Status after a successful pipeline run, but my ultimate goal is to have the pipeline run on its own trigger rather than being started by my Logic App.

Currently, for every file I need to process I have one Logic App and one pipeline. I need to streamline this because it's becoming cumbersome creating a new Logic App for every new file I need to work with. So my final goal is to alter my Logic App so it can process any file that comes into the FTP site, place it in a single large Blob storage location where all files will reside, and write records to the Azure Table in the process.

I will then use the individual pipelines to check the Azure Table for items with Status "100" that need to be imported into the database.
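
Conceptually, each pipeline would start with a lookup along these lines (again a sketch with placeholder names; the filter could also include the specific file name or type that pipeline handles):

```python
from azure.data.tables import TableClient

table_client = TableClient.from_connection_string(
    conn_str="<storage-connection-string>",
    table_name="FileImports",
)

# Find every file that has been uploaded (Status "100") but not yet imported into SQL.
pending = table_client.query_entities("Status eq '100'")
for entity in pending:
    print(entity["FileName"], entity["FilePath"])
```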

Azure Table Storage
An Azure service that stores structured NoSQL data in the cloud.
Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

1 answer

  1. MartinJaffer-MSFT 26,011 Reputation points
    2022-04-07T20:53:03.373+00:00

    Hello @Michael Giusto ,
    Thanks for the question and using MS Q&A platform.

    As I understand it, the overall goal is to have a process that:

    1. Determines which files on SFTP need to be processed
    2. Copies the files from SFTP to Blob storage
    3. Inserts a record into Table storage confirming the upload state, plus other metadata
    4. Copies from Blob storage to Azure SQL
    5. Updates the earlier record to confirm the data is in SQL

    Currently you are using both Logic Apps and Data Factory. I am confused about why you are using both services; wouldn't it be easier to put everything in one or the other?

    it's becoming cumbersome creating a new Logic App for every new file I need to work with.

    This would be solved by step 1 above (determining which files need to be processed). I need to know more about how you choose files. Is it by file modified date on the SFTP, or something else?

    Please do let me know if you have any queries.

    Thanks
    Martin


    • Please don't forget to click "Accept Answer" or the upvote button whenever the information provided helps you. Original posters help the community find answers faster by identifying the correct answer.
    • Want a reminder to come back and check responses? Here is how to subscribe to a notification