Plan your Azure Time Series Insights Gen2 environment
Note
The Time Series Insights service will be retired on 7 July 2024. Consider migrating existing environments to alternative solutions as soon as possible. For more information about the deprecation and migration, see the migration documentation.
This article describes best practices to plan and get started quickly by using Azure Time Series Insights Gen2.
Best practices for planning and preparation
Best practices for planning and preparing your environment are described in the following articles:
- What you get when you provision an Azure Time Series Insights Gen2 environment.
- What your Time Series IDs and Timestamp properties are.
- What the new Time Series Model is, and how to build your own.
- How to send events efficiently in JSON.
- Azure Time Series Insights business disaster recovery options.
Azure Time Series Insights employs a pay-as-you-go business model. For more information about charges and capacity, read Azure Time Series Insights pricing.
The Gen2 environment
When you provision an Azure Time Series Insights Gen2 environment, you create two Azure resources:
- An Azure Time Series Insights Gen2 environment
- An Azure Storage account
As part of the provisioning process, you specify whether you want to enable a warm store. Warm store provides a tiered query experience. When enabled, you must specify a retention period between 7 and 30 days. Queries executed within the warm store retention period generally provide faster response times. When a query spans beyond the warm store retention period, it's served from cold store.
Queries on warm store are free, while queries on cold store incur costs. It's important to understand your query patterns and plan your warm store configuration accordingly. We recommend that interactive analytics on the most recent data reside in your warm store and pattern analysis and long-term trends reside in cold.
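As a rough illustration of the tiering described above, the helper below (a hypothetical sketch, not part of any Azure SDK) decides whether a query's time span falls entirely within a warm store retention window:

```python
from datetime import datetime, timedelta, timezone

def query_tier(search_from: datetime, search_to: datetime,
               warm_retention_days: int = 7) -> str:
    """Return 'warm' if the whole query span fits inside the warm
    store retention window, otherwise 'cold'. Illustrative only."""
    warm_start = datetime.now(timezone.utc) - timedelta(days=warm_retention_days)
    return "warm" if search_from >= warm_start else "cold"

now = datetime.now(timezone.utc)
print(query_tier(now - timedelta(days=2), now))        # recent interactive query
print(query_tier(now - timedelta(days=90), now, 30))   # long-term trend analysis
```

Planning which queries land on which tier this way helps keep interactive analytics free on warm store while reserving cold store for pattern analysis.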
Note
To read more about how to query your warm data, read the API Reference.
To start, you need three additional items:
- A Time Series Model
- An event source connected to Time Series Insights
- Events flowing into the event source that are both mapped to the model and are in valid JSON format
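A minimal event satisfying the last requirement might look like the following sketch; the property names (`deviceId`, `ts`, `temperature`, `humidity`) are hypothetical and must match the Time Series ID and Timestamp properties you configure:

```python
import json

# Hypothetical event: "deviceId" is the configured Time Series ID property,
# "ts" is the configured Timestamp property (ISO 8601, UTC).
event = {
    "deviceId": "sensor-001",
    "ts": "2024-01-15T08:30:00Z",
    "temperature": 21.7,
    "humidity": 48.2,
}
payload = json.dumps(event)  # valid JSON, ready to send to the event source
print(payload)
```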
Review Azure Time Series Insights Gen2 limits
Property limits
Azure Time Series Insights property limits have increased to 1,000 for warm storage and no property limit for cold storage. Supplied event properties have corresponding JSON, CSV, and chart columns that you can view within the Azure Time Series Insights Gen2 Explorer.
| SKU | Maximum properties |
| --- | --- |
| Gen2 (L1) | 1,000 properties (columns) for warm storage and unlimited for cold storage |
| Gen1 (S1) | 600 properties (columns) |
| Gen1 (S2) | 800 properties (columns) |
Streaming Ingestion
There is a maximum of two event sources per environment.
Best practices and general guidance for event sources are described in the event sources documentation.
By default, Azure Time Series Insights Gen2 can ingest incoming data at a rate of up to 1 megabyte per second (MBps) per Azure Time Series Insights Gen2 environment. There are additional limitations per hub partition. Rates of up to 2 MBps can be provided by submitting a support ticket through the Azure portal. To learn more, read Streaming Ingestion Throughput Limits.
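To sanity-check a planned event rate against the default limit, simple arithmetic suffices. The workload numbers below are illustrative, and 1 MB is treated as 10^6 bytes for simplicity:

```python
DEFAULT_LIMIT_BPS = 1_000_000  # default 1 MBps per Gen2 environment

def ingestion_utilization(events_per_second: float, avg_event_bytes: float,
                          limit_bps: float = DEFAULT_LIMIT_BPS) -> float:
    """Fraction of the ingestion limit consumed by the planned workload."""
    return (events_per_second * avg_event_bytes) / limit_bps

# Hypothetical workload: 2,000 events/s averaging 400 bytes each
util = ingestion_utilization(2000, 400)
print(f"{util:.0%} of the default 1 MBps limit")  # 80%
```

If the result approaches or exceeds 100%, consider batching events, trimming payloads, or requesting the higher rate through a support ticket.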
API limits
REST API limits for Azure Time Series Insights Gen2 are specified in the REST API reference documentation.
Configure Time Series IDs and Timestamp properties
To create a new Azure Time Series Insights environment, select a Time Series ID. The Time Series ID acts as a logical partition for your data. As noted, make sure you have your Time Series IDs ready.
Important
Time Series IDs can't be changed later. Verify each one before final selection and first use.
You can select up to three keys to uniquely differentiate your resources. For more information, read Best practices for choosing a Time Series ID and Ingestion rules.
The Timestamp property is also important. You can designate this property when you add event sources. Each event source has an optional Timestamp property that's used to track events over time. Timestamp values are case sensitive and must be formatted to the individual specification of each event source.
When left blank, the time when the event was enqueued into the IoT Hub or Event Hub is used as the event Timestamp. In general, users should opt to customize the Timestamp property and use the time when the sensor or tag generated the reading, rather than the hub enqueued time. For more information, and to read about time zone offsets, read Event source timestamp.
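The sketch below shows a hypothetical event whose Time Series ID is a composite of three keys and whose Timestamp property carries the sensor reading time rather than the hub enqueued time. All property names here are assumptions for illustration:

```python
# Hypothetical composite Time Series ID made of three keys (the maximum),
# plus a custom Timestamp property ("ts") set to the sensor reading time
# rather than the hub enqueued time. Timestamp values are case sensitive.
event = {
    "plantId": "plant-7",            # Time Series ID key 1
    "lineId": "line-3",              # Time Series ID key 2
    "sensorId": "temp-12",           # Time Series ID key 3
    "ts": "2024-01-15T08:30:00Z",    # when the sensor generated the reading
    "temperature": 21.7,
}
print(event["ts"])
```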
Understand the Time Series Model
You can now configure your Azure Time Series Insights environment's Time Series Model. The new model makes it easy to find and analyze IoT data. It enables the curation, maintenance, and enrichment of time series data and helps to prepare consumer-ready data sets. The model uses Time Series IDs, each of which maps to an instance that associates the unique resource with variables (known as types) and hierarchies. Read about the Time Series Model overview to learn more.
The model is dynamic, so it can be built at any time. To get started quickly, build and upload it prior to pushing data into Azure Time Series Insights. To build your model, read Use the Time Series Model.
For many customers, the Time Series Model maps to an existing asset model or ERP system already in place. If you don't have an existing model, a prebuilt user experience is provided to get up and running quickly.
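A minimal model might pair one type with one instance, as in the sketch below. The shape is loosely modeled on the Gen2 Time Series Model concepts; treat the exact schema as an assumption and verify it against the Time Series Model reference before use:

```python
# Hypothetical Time Series Model fragment: a type defines variables
# (here, an average-temperature aggregate), and an instance binds a
# Time Series ID to that type plus descriptive instance fields.
ts_type = {
    "id": "type-temp-sensor",
    "name": "TemperatureSensor",
    "variables": {
        "AvgTemperature": {
            "kind": "numeric",
            "value": {"tsx": "$event.temperature"},
            "aggregation": {"tsx": "avg($value)"},
        }
    },
}

instance = {
    "timeSeriesId": ["sensor-001"],
    "typeId": "type-temp-sensor",   # links the instance to its type
    "name": "Boiler room sensor",
    "instanceFields": {"site": "Plant 7"},
}
```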
Shape your events
You can verify the way that you send events to Azure Time Series Insights. Ideally, your events are well denormalized and sent efficiently.
A good rule of thumb:
- Store metadata in your Time Series Model.
- Ensure that the Time Series Model, instance fields, and events include only necessary information, such as a Time Series ID or Timestamp property.
For more information and to understand how events will be flattened and stored, read the JSON flattening and escaping rules.
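To build intuition for how a nested event ends up as columns, the sketch below approximates flattening by joining nested property names with an underscore. This is only an approximation of the documented behavior (arrays and escaping follow additional rules covered in the linked documentation):

```python
def flatten(obj: dict, prefix: str = "") -> dict:
    """Flatten nested JSON by joining names with '_'. An approximation
    of the Gen2 flattening rules for illustration only."""
    flat = {}
    for key, value in obj.items():
        name = f"{prefix}_{key}" if prefix else key
        if isinstance(value, dict):
            flat.update(flatten(value, name))
        else:
            flat[name] = value
    return flat

event = {"deviceId": "sensor-001", "location": {"building": "A", "floor": 2}}
print(flatten(event))
# {'deviceId': 'sensor-001', 'location_building': 'A', 'location_floor': 2}
```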
Business disaster recovery
This section describes features of Azure Time Series Insights that keep apps and services running, even if a disaster occurs (known as business disaster recovery).
High availability
As an Azure service, Azure Time Series Insights provides certain high availability features by using redundancies at the Azure region level. For example, Azure supports disaster recovery capabilities through Azure's cross-region availability feature.
Additional high-availability features provided through Azure (and also available to any Azure Time Series Insights instance) include:
- Failover: Azure provides geo-replication and load balancing.
- Data restoration and storage recovery: Azure provides several options to preserve and recover data.
- Azure Site Recovery: Azure provides recovery features through Azure Site Recovery.
- Azure Backup: Azure Backup supports both on-premises and in-cloud backup of Azure VMs.
Make sure you enable the relevant Azure features to provide global, cross-region high availability for your devices and users.
Note
If Azure is configured to enable cross-region availability, no additional cross-region availability configuration is required in Azure Time Series Insights.
IoT and event hubs
Some Azure IoT services also include built-in business disaster recovery features:
- Azure IoT Hub high-availability disaster recovery, which includes intra-region redundancy
- Azure Event Hubs policies
- Azure Storage redundancy
Integrating Azure Time Series Insights with the other services provides additional disaster recovery opportunities. For example, telemetry sent to your event hub might be persisted to a backup Azure Blob storage account.
Azure Time Series Insights
There are several ways to keep your Azure Time Series Insights data, apps, and services running, even if they're disrupted.
However, you might determine that a complete backup copy of your Azure Time Series Insights environment is also required, for the following purposes:
- As a failover instance specifically for Azure Time Series Insights to redirect data and traffic to
- To preserve data and auditing information
In general, the best way to duplicate an Azure Time Series Insights environment is to create a second Azure Time Series Insights environment in a backup Azure region. Events are also sent to this secondary environment from your primary event source. Make sure that you use a second dedicated consumer group. Follow that source's business disaster recovery guidelines, as described earlier.
To create a duplicate environment:
- Create an environment in a second region. For more information, read Create a new Azure Time Series Insights environment in the Azure portal.
- Create a second dedicated consumer group for your event source.
- Connect that event source to the new environment. Make sure that you designate the second dedicated consumer group.
- Review the Azure Time Series Insights IoT Hub and Event Hubs documentation.
If an event occurs:
- If your primary region is affected during a disaster incident, reroute operations to the backup Azure Time Series Insights environment.
- Because hub sequence numbers restart from 0 after the failover, recreate the event source in both regions/environments with different consumer groups to avoid creating what would look like duplicate events.
- Delete the primary event source, which is now inactive, to free up an available event source for your environment. (There's a limit of two active event sources per environment.)
- Use your second region to back up and recover all Azure Time Series Insights telemetry and query data.
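The rerouting step above can be sketched as client-side endpoint selection. The endpoint URLs below are placeholders, not real resources, and the health signal is assumed to come from your own monitoring:

```python
# Hypothetical failover between a primary and a backup TSI environment.
ENDPOINTS = [
    "https://primary-env.env.timeseries.azure.com",   # primary region
    "https://backup-env.env.timeseries.azure.com",    # backup region
]

def pick_endpoint(healthy: dict) -> str:
    """Return the first healthy endpoint, preferring the primary."""
    for url in ENDPOINTS:
        if healthy.get(url, False):
            return url
    raise RuntimeError("no healthy Time Series Insights environment")

# Normal operation: primary is used. After a regional outage, traffic
# is rerouted to the backup environment.
print(pick_endpoint({ENDPOINTS[0]: True, ENDPOINTS[1]: True}))
print(pick_endpoint({ENDPOINTS[0]: False, ENDPOINTS[1]: True}))
```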
Important
If a failover occurs:
- A delay in message processing might occur.
- A momentary spike in message processing might occur, as operations are rerouted.
For more information, read Mitigate latency in Azure Time Series Insights.
Next steps
- Review Azure Advisor to plan out your business recovery configuration options.
- Read more about data ingestion in Azure Time Series Insights Gen2.
- Review the article on data storage in Azure Time Series Insights Gen2.
- Learn about data modeling in Azure Time Series Insights Gen2.