On the Real-Time hub page, select + Data sources under Connect to in the left navigation menu. You can also get to the Data sources page from the All data streams or My data streams pages by selecting the + Connect data source button in the top-right corner.
Add Apache Kafka as a source
Here are the steps to add an Apache Kafka topic as a source in Fabric Real-Time hub.
On the Select a data source page, select Apache Kafka.
On the Connect page, select New connection.
In the Connection settings section, for Bootstrap Server, enter your Apache Kafka server address.
In the Connection credentials section, if you have an existing connection to the Apache Kafka cluster, select it from the Connection drop-down list. Otherwise, follow these steps:
For Connection name, enter a name for the connection.
For Authentication kind, confirm that API Key is selected.
For Key and Secret, enter the API key and the key secret. (A sketch of how these map to standard Kafka credentials follows these steps.)
Select Connect.
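The Key and Secret values typically correspond to a SASL username and password on the Kafka side. If you want to verify the credentials outside Fabric before connecting, the following is a minimal sketch using the confluent-kafka Python client; the server address, key, and secret shown are placeholders, not values from this article.

```python
# Minimal credential check with the confluent-kafka client (pip install confluent-kafka).
# All values are placeholders; substitute your own bootstrap server, API key, and secret.
from confluent_kafka.admin import AdminClient

conf = {
    "bootstrap.servers": "my-kafka-broker:9092",  # the Bootstrap Server address
    "security.protocol": "SASL_PLAINTEXT",        # the wizard's default protocol
    "sasl.mechanism": "PLAIN",                    # the typical default mechanism
    "sasl.username": "MY_API_KEY",                # the Key field
    "sasl.password": "MY_API_SECRET",             # the Secret field
}

# Listing topics succeeds only if the broker accepts the connection and credentials.
metadata = AdminClient(conf).list_topics(timeout=10)
print(sorted(metadata.topics))
```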
Now, on the Connect page, follow these steps. (A sketch of how these fields translate to standard Kafka consumer settings appears after the procedure.)
For Topic, enter the Kafka topic.
For Consumer group, enter the consumer group of your Apache Kafka cluster. This field gives the eventstream a dedicated consumer group for reading events.
Select a Reset auto offset value to specify where to start reading offsets when there's no committed offset.
For Security protocol, the default value is SASL_PLAINTEXT.
Note
The Apache Kafka source currently supports only unencrypted data transmission (SASL_PLAINTEXT and PLAINTEXT) between your Apache Kafka cluster and Eventstream. Support for encrypted data transmission via SSL will be available soon.
The default SASL mechanism is typically PLAIN, unless configured otherwise. You can instead select the SCRAM-SHA-256 or SCRAM-SHA-512 mechanism if it better suits your security requirements.
Select Next. On the Review + connect page, review the summary, and then select Add.
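For reference, here's roughly how the Connect page fields translate into standard Kafka consumer configuration. This is an illustrative sketch with the confluent-kafka Python client and hypothetical values, not a description of how Eventstream is actually implemented.

```python
# Illustrative consumer configuration mirroring the Connect page fields.
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "my-kafka-broker:9092",
    "group.id": "my-consumer-group",        # Consumer group
    "auto.offset.reset": "earliest",        # Reset auto offset: "earliest" or "latest"
    "security.protocol": "SASL_PLAINTEXT",  # Security protocol
    "sasl.mechanism": "PLAIN",              # or "SCRAM-SHA-256" / "SCRAM-SHA-512"
    "sasl.username": "MY_API_KEY",
    "sasl.password": "MY_API_SECRET",
})
consumer.subscribe(["my-topic"])            # Topic

# Read a single event, if one arrives within 10 seconds.
msg = consumer.poll(timeout=10.0)
if msg is not None and msg.error() is None:
    print(msg.value().decode("utf-8"))
consumer.close()
```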
View data stream details
On the Review + connect page, if you select Open eventstream, the wizard opens the eventstream that it created for you with the selected Apache Kafka source. To close the wizard, select Close at the bottom of the page.
In Real-Time hub, switch to the Data streams tab and refresh the page. You should see the data stream that was created for you.
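If the new data stream isn't showing events yet, one way to check the pipeline end to end is to publish a test event to the topic and watch it appear. Here's a minimal producer sketch with the confluent-kafka client, again using placeholder values.

```python
# Send a test event to the topic so the new data stream has something to show.
# Placeholder values throughout; reuse the same server, credentials, and topic as above.
import json
from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "my-kafka-broker:9092",
    "security.protocol": "SASL_PLAINTEXT",
    "sasl.mechanism": "PLAIN",
    "sasl.username": "MY_API_KEY",
    "sasl.password": "MY_API_SECRET",
})

producer.produce("my-topic", value=json.dumps({"deviceId": 1, "temperature": 21.5}))
producer.flush(timeout=10)  # wait for the broker to acknowledge the event
```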
Related content
Analyze data streams in Real-Time hub: process them by using transformations in eventstreams, add an Eventhouse destination to send the data to a Kusto Query Language (KQL) table, and analyze it there.
Process data streams in Real-Time hub: apply transformations in eventstreams, send the data to a KQL table for analysis, and set alerts.
Get OneLake events as a Fabric eventstream in Real-Time hub: transform the events and send them to supported destinations.