REST API incrementally loaded into a dataset

Stef G 1 Reputation point
2022-08-26T14:27:46.08+00:00

Hello all,

I am trying to figure out the best way to

  1. Incrementally update a dataset (SQL database, JSON object, etc.) using a REST API call
  2. Transform and split it into separate tables (e.g. using Azure Data Factory)
  3. Feed the tables to Power BI for further analysis
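For step 1, the usual pattern is a high-watermark incremental load: remember the timestamp of the last successful run, ask the API only for records modified after it, and upsert those. Here is a minimal sketch of that logic; the record shape, the `modified` field, and `fetch_since` (standing in for the real REST call) are all hypothetical.

```python
from datetime import datetime, timezone

# Hypothetical records as a stand-in for the real REST endpoint's response.
SAMPLE_RECORDS = [
    {"id": 1, "modified": "2022-08-01T00:00:00+00:00", "value": 10},
    {"id": 2, "modified": "2022-08-20T00:00:00+00:00", "value": 20},
    {"id": 3, "modified": "2022-08-25T00:00:00+00:00", "value": 30},
]

def fetch_since(watermark: datetime) -> list[dict]:
    """Stand-in for a call like GET /records?modified_after=<watermark>."""
    return [r for r in SAMPLE_RECORDS
            if datetime.fromisoformat(r["modified"]) > watermark]

def incremental_load(watermark: datetime, store: dict) -> datetime:
    """Upsert only records changed since the last run; return the new watermark."""
    changed = fetch_since(watermark)
    for rec in changed:
        store[rec["id"]] = rec  # upsert keyed by id, no full rewrite
    if changed:
        watermark = max(datetime.fromisoformat(r["modified"]) for r in changed)
    return watermark

store: dict = {}
wm = datetime(2022, 8, 15, tzinfo=timezone.utc)  # watermark from the last run
wm = incremental_load(wm, store)                  # only ids 2 and 3 are newer
```

Data Factory supports this same pattern natively (a watermark column plus a lookup of the last value), so the logic above can live entirely in a pipeline rather than in code.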

There are many different recommendations and services in Azure, and I have been overwhelmed, since I cannot find a straightforward answer.

Do you have any recommendations on how to achieve this efficiently? Which Azure services, data storage (Blob, SQL, etc.), and workflow would produce the best result?

Kind regards,
Stefanos

Azure SQL Database
Azure Data Factory

1 answer

  1. Stef G 1 Reputation point
    2022-08-31T18:18:49.767+00:00

    Hello @MartinJaffer-MSFT ,

    Thank you for your reply and sorry for the delay on my side.

    Let's forget the Power BI part (maybe I shouldn't have mentioned it) and focus on the dataset part.

    The data comes from a REST API call. I want to trigger a call that updates a local JSON blob stored in Azure, and afterwards I would like to transform that data.

    I am currently using Data Factory and exporting the transformed data into different CSV files (sink).
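The "split into separate tables" step can be pictured as flattening one nested JSON blob into several flat row sets, which is roughly what a Data Factory data flow with a flatten transformation does before writing each set to its own sink. A small sketch, with an entirely hypothetical record shape (`orders` with nested `items`):

```python
import csv
import io

# Hypothetical nested record as it might arrive from the REST API.
blob = [
    {"order_id": 1, "customer": "acme",
     "items": [{"sku": "A", "qty": 2}, {"sku": "B", "qty": 1}]},
]

# Split the nested blob into two flat tables, one row set per sink.
orders = [{"order_id": o["order_id"], "customer": o["customer"]} for o in blob]
items = [{"order_id": o["order_id"], **it} for o in blob for it in o["items"]]

def to_csv(rows: list[dict]) -> str:
    """Render one flat row set as CSV, as a copy-activity sink would."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```

Here `to_csv(items)` yields a two-row child table keyed back to the parent by `order_id`, so the two CSVs can later be joined or loaded into separate SQL tables.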

    This process seems very inefficient to me, and I am trying to figure out the best tool to automatically load data received from the REST API into multiple tables without having to overwrite a huge amount of data.
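One common way to avoid rewriting everything each run is to append each run's delta to its own dated file (a partition) and let downstream steps pick up only new partitions; a parameterized sink path in a Data Factory copy activity achieves the same layout. A minimal sketch, with hypothetical path names:

```python
import json
import tempfile
from datetime import datetime, timezone
from pathlib import Path

def write_delta(records: list[dict], out_dir: Path, run_time: datetime) -> Path:
    """Write only this run's changed records to a dated partition,
    e.g. out_dir/year=2022/month=08/day=26/delta.json, instead of
    overwriting one huge blob every time."""
    part = (out_dir / f"year={run_time:%Y}"
                    / f"month={run_time:%m}"
                    / f"day={run_time:%d}")
    part.mkdir(parents=True, exist_ok=True)
    path = part / "delta.json"
    path.write_text(json.dumps(records))
    return path

out = Path(tempfile.mkdtemp())
p = write_delta([{"id": 2, "value": 20}],
                out, datetime(2022, 8, 26, tzinfo=timezone.utc))
```

Each run then touches only its own small file, and a scheduled transform can merge fresh partitions into the target tables.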

    Could I achieve better results with services like Kusto, Databricks, or Azure streaming services?

    Sorry if I am not describing the issue properly; I am very new to this field, and especially to Azure services.

    Kind regards,
    Stefanos