Upload Historical Data to Azure TSI Gen2

Sivaji Gudipati 26 Reputation points
2021-06-15T17:58:05.407+00:00

I have my time-series data in a Postgres database that is updated in real time, and I need to ingest it into Azure TSI. The database already holds 6 months of data, and we can only access it through REST APIs. I created a 30-minute timer-triggered Azure Function that fetches the last 30 minutes of data from the Postgres database and uses the Python SDK to send this data to Azure Event Hubs in batches. That event hub is the event source for my TSI environment, and the live data is now reaching TSI.

However, I was wondering what the best process is to ingest the historical data. I have 6 months of data at 5-minute intervals with around 9 columns. Is there any way I can upload this data using a Parquet file? If so, could you please point me to the documentation for that?
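For reference, here is roughly what my timer-triggered function does. This is a simplified sketch, not my actual code: the REST endpoint, its query parameters, and the environment variable names are placeholders.

```python
import json
import os
from datetime import datetime, timedelta, timezone

import azure.functions as func
import requests
from azure.eventhub import EventData, EventHubProducerClient


def main(mytimer: func.TimerRequest) -> None:
    # Fetch the last 30 minutes of rows from the Postgres-backed REST API.
    # POSTGRES_API_URL and the from/to parameters are placeholders.
    end = datetime.now(timezone.utc)
    start = end - timedelta(minutes=30)
    resp = requests.get(
        os.environ["POSTGRES_API_URL"],
        params={"from": start.isoformat(), "to": end.isoformat()},
    )
    resp.raise_for_status()
    rows = resp.json()  # list of dicts, ~9 columns each

    producer = EventHubProducerClient.from_connection_string(
        conn_str=os.environ["EVENTHUB_CONN_STR"],
        eventhub_name=os.environ["EVENTHUB_NAME"],
    )
    with producer:
        batch = producer.create_batch()
        for row in rows:
            event = EventData(json.dumps(row, default=str))
            try:
                batch.add(event)
            except ValueError:  # batch is full: send it and start a new one
                producer.send_batch(batch)
                batch = producer.create_batch()
                batch.add(event)
        if len(batch) > 0:
            producer.send_batch(batch)
```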

Azure Time Series Insights
An Azure internet of things (IoT) analytics platform to monitor, analyze, and visualize industrial IoT analytics data at scale.

Accepted answer
QuantumCache 20,271 Reputation points
2021-06-25T05:56:04.113+00:00

Hello @Sivaji Gudipati

Thanks for your patience in this matter; we received a response from the Product Team on your initial query.

Streaming ingestion from an event hub is the only supported method of ingestion. Streaming large amounts of historical data isn't recommended, but you can do it with some caveats: Streaming ingestion event sources - Azure Time Series Insights Gen2 | Microsoft Learn.
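If you do stream the history through the same event hub, a reasonable approach is to replay it in timestamp order, keep the original timestamp on each event, and throttle the send rate so you stay under the event hub's ingress limits. Below is a minimal sketch, assuming the 6 months of history has been exported to a local Parquet file named history.parquet with a ts timestamp column; both names are placeholders, not something from this thread.

```python
import json
import os
import time

import pandas as pd
from azure.eventhub import EventData, EventHubProducerClient

# Assumed export of the historical data; 'history.parquet' and the
# 'ts' column are placeholders for your own data layout.
df = pd.read_parquet("history.parquet").sort_values("ts")

producer = EventHubProducerClient.from_connection_string(
    conn_str=os.environ["EVENTHUB_CONN_STR"],
    eventhub_name=os.environ["EVENTHUB_NAME"],
)
with producer:
    batch = producer.create_batch()
    for record in df.to_dict(orient="records"):
        # default=str stringifies the pandas Timestamp in 'ts'
        event = EventData(json.dumps(record, default=str))
        try:
            batch.add(event)
        except ValueError:  # batch full: send, pause, start a new batch
            producer.send_batch(batch)
            time.sleep(1)  # throttle to stay under ingress limits
            batch = producer.create_batch()
            batch.add(event)
    if len(batch) > 0:
        producer.send_batch(batch)
```

Note that the TSI event source's Timestamp property name should be set to ts (or whatever your timestamp column is called); otherwise TSI indexes events by their enqueue time, which would place all of the backfilled history at ingestion time rather than at its original time.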

Please leave your comment or feedback in the section below.

If the response is helpful, please click "Accept Answer" and upvote it.

