Configure dataflow endpoints

Important

Azure IoT Operations Preview – enabled by Azure Arc is currently in preview. You shouldn't use this preview software in production environments.

You'll need to deploy a new Azure IoT Operations installation when a generally available release becomes available. You won't be able to upgrade a preview installation.

See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.

To get started with dataflows, first create dataflow endpoints. A dataflow endpoint is the connection point for the dataflow. You can use an endpoint as a source or destination for the dataflow. Some endpoint types can be used as both sources and destinations, while others are for destinations only. A dataflow needs at least one source endpoint and one destination endpoint.

Get started

To get started, use the following table to choose the endpoint type to configure:

| Endpoint type | Description | Can be used as a source | Can be used as a destination |
|---------------|-------------|-------------------------|------------------------------|
| MQTT | For bi-directional messaging with MQTT brokers, including the one built into Azure IoT Operations and Event Grid. | Yes | Yes |
| Kafka | For bi-directional messaging with Kafka brokers, including Azure Event Hubs. | Yes | Yes |
| Data Lake | For uploading data to Azure Data Lake Gen2 storage accounts. | No | Yes |
| Microsoft Fabric OneLake | For uploading data to Microsoft Fabric OneLake lakehouses. | No | Yes |
| Local storage | For sending data to a locally available persistent volume, through which you can upload data via Azure Container Storage enabled by Azure Arc edge volumes. | No | Yes |
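
Whichever type you choose, the endpoint is described declaratively. As a minimal sketch, assuming the Kubernetes deployment model and the preview DataflowEndpoint custom resource (the API version, field names, and host value shown here are illustrative assumptions and may differ in your release), an MQTT endpoint for the built-in broker might look like this:

```yaml
# Illustrative sketch of a DataflowEndpoint custom resource.
# The API version and field names are assumptions; check the
# reference for your release before applying anything like this.
apiVersion: connectivity.iotoperations.azure.com/v1beta1
kind: DataflowEndpoint
metadata:
  name: mq-endpoint
  namespace: azure-iot-operations
spec:
  endpointType: Mqtt                # one of the types from the table above
  mqttSettings:
    host: "aio-broker:18883"        # where data comes from or goes to
    authentication:
      method: ServiceAccountToken   # how to authenticate with the endpoint
      serviceAccountTokenSettings:
        audience: aio-internal
    tls:
      mode: Enabled                 # other settings, like TLS configuration
```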

Reuse endpoints

Think of each dataflow endpoint as a bundle of configuration settings: where the data should come from or go to (the host value), how to authenticate with the endpoint, and other settings like TLS configuration or batching preference. You only need to create an endpoint once, and you can then reuse it in any dataflow where these settings are the same.

To make it easier to reuse endpoints, the MQTT or Kafka topic filter isn't part of the endpoint configuration. Instead, you specify the topic filter in the dataflow configuration. This means you can use the same endpoint for multiple dataflows that use different topic filters.

For example, you can use the default MQTT broker dataflow endpoint for both the source and the destination, each with a different topic filter:

[Screenshot: Using the operations experience to create a dataflow from MQTT to MQTT.]
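
In configuration terms, that pattern might look like the following sketch. The Dataflow schema shown here (profileRef, endpointRef, dataSources, dataDestination) and the topic names are assumptions based on the preview API, not a definitive reference:

```yaml
# Sketch: one MQTT endpoint reused as both source and destination,
# with the topic filters supplied by the dataflow itself.
apiVersion: connectivity.iotoperations.azure.com/v1beta1
kind: Dataflow
metadata:
  name: mqtt-to-mqtt
  namespace: azure-iot-operations
spec:
  profileRef: default
  operations:
    - operationType: Source
      sourceSettings:
        endpointRef: mq-endpoint            # endpoint sketched earlier
        dataSources:
          - thermostats/+/telemetry         # source topic filter
    - operationType: Destination
      destinationSettings:
        endpointRef: mq-endpoint            # same endpoint, different topic
        dataDestination: factory/telemetry  # destination topic
```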

Similarly, you can create multiple dataflows that pair the same MQTT endpoint with other endpoints and topics. For example, you can use the same MQTT endpoint as the source for a dataflow that sends data to a Kafka endpoint.

[Screenshot: Using the operations experience to create a dataflow from MQTT to Kafka.]
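
Under the same schema assumptions as the previous sketch, that combination might look like this, with a hypothetical Kafka endpoint named kafka-endpoint defined separately:

```yaml
# Sketch: the same MQTT endpoint as the source, and a Kafka
# endpoint (hypothetical name: kafka-endpoint) as the destination.
apiVersion: connectivity.iotoperations.azure.com/v1beta1
kind: Dataflow
metadata:
  name: mqtt-to-kafka
  namespace: azure-iot-operations
spec:
  profileRef: default
  operations:
    - operationType: Source
      sourceSettings:
        endpointRef: mq-endpoint       # reused from the MQTT example
        dataSources:
          - thermostats/+/telemetry
    - operationType: Destination
      destinationSettings:
        endpointRef: kafka-endpoint    # Kafka endpoint defined separately
        dataDestination: telemetry     # Kafka topic, set per dataflow
```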

Similar to the MQTT example, you can create multiple dataflows that use the same Kafka endpoint for different topics, or the same Data Lake endpoint for different tables.
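
For instance, two dataflows could share one Data Lake endpoint and differ only in their destination settings. In this sketch, the endpoint name adls-endpoint is hypothetical, and dataDestination naming the target table is an assumption from the preview schema:

```yaml
# Destination fragments from two dataflows that reuse one Data Lake
# endpoint; only the target table differs between them.
destinationSettings:
  endpointRef: adls-endpoint
  dataDestination: thermostat-telemetry   # table for dataflow 1
---
destinationSettings:
  endpointRef: adls-endpoint
  dataDestination: valve-telemetry        # different table, same endpoint
```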