Add Confluent Cloud Kafka as source in Real-Time hub (preview)

This article describes how to add Confluent Cloud Kafka as an event source in Fabric Real-Time hub.

Note

Real-Time hub is currently in preview.

Prerequisites

  • Access to a Fabric premium workspace with Contributor or higher permissions.
  • A Confluent Cloud Kafka cluster and an API key.
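
Before you start the wizard, you can optionally confirm that the cluster address and API key work by producing a test event from a client. The following is a minimal sketch, assuming the confluent-kafka Python package and placeholder values for the bootstrap server, API key, secret, and topic; replace them with your own.

    # Minimal connectivity check (assumes: pip install confluent-kafka).
    # All concrete values below are placeholders.
    from confluent_kafka import Producer

    producer = Producer({
        "bootstrap.servers": "pkc-xxxxx.region.provider.confluent.cloud:9092",  # from Cluster Settings
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "PLAIN",
        "sasl.username": "<API Key>",
        "sasl.password": "<API Key Secret>",
    })

    # Send one test event; the topic must already exist in your cluster.
    producer.produce("my-topic", value=b'{"hello": "fabric"}')
    producer.flush(10)  # wait up to 10 seconds for delivery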

Launch Add source experience

  1. Sign in to Microsoft Fabric.

  2. Select Real-Time hub on the left navigation bar.

  3. On the Real-Time hub page, select + Add source in the top-right corner of the page.

    Screenshot that shows how to launch Real-Time hub in Microsoft Fabric.

Add Confluent Cloud Kafka as a source

  1. On the Select a data source page, select Confluent.

    Screenshot that shows the selection of Confluent as the source type in the Add source wizard.

  2. To create a connection to the Confluent Cloud Kafka source, select New connection.

    Screenshot that shows the selection of the New connection link on the Connect page of the Add source wizard.

  3. In the Connection settings section, enter the Confluent Bootstrap Server. To find the address, go to your Confluent Cloud home page, select Cluster Settings, and copy the bootstrap server address.

  4. In the Connection credentials section, if you have an existing connection to the Confluent cluster, select it from the drop-down list for Connection. Otherwise, follow these steps:

    1. For Connection name, enter a name for the connection.
    2. For Authentication kind, confirm that Confluent Cloud Key is selected.
    3. For API Key and API Key Secret:
      1. Navigate to your Confluent Cloud.

      2. Select API Keys on the side menu.

      3. Select the Add key button to create a new API key.

      4. Copy the API Key and Secret.

      5. Paste those values into the API Key and API Key Secret fields.

      6. Select Connect.

        Screenshot that shows the first page of the Confluent connection settings.

  5. Scroll down to the Configure Confluent data source section on the page, and enter the following information to complete the configuration of the Confluent data source. These settings correspond to standard Kafka client properties; see the sketch after these steps.

    1. For Topic, enter a topic name from your Confluent Cloud cluster. You can create or manage topics in the Confluent Cloud Console.
    2. For Consumer group, enter the name of a consumer group for your Confluent Cloud cluster. This dedicated consumer group is used to read events from the cluster.
    3. For the Reset auto offset setting, select one of the following values:
      • Earliest – the earliest data available from your Confluent cluster

      • Latest – the latest available data

      • None – don't automatically set the offset.

        Screenshot that shows the second page - Configure Confluent data source page - of the Confluent connection settings.

  6. In the Stream details section of the right pane, follow these steps:

    1. Select the workspace where you want to save the connection.

    2. Enter a name for the eventstream to be created for you.

    3. The name of the stream for Real-Time hub is generated for you automatically.

      Screenshot that shows the right pane with Stream details section of the Confluent connection settings page.

  7. Select Next.

  8. On the Review and create screen, review the summary, and then select Create source.
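
The source settings you entered in the wizard map onto standard Kafka client properties. As a rough sketch, assuming the confluent-kafka Python package and placeholder values for the bootstrap server, credentials, topic, and consumer group, an equivalent consumer configuration looks like this:

    # Minimal sketch of the equivalent Kafka consumer settings; all values are placeholders.
    from confluent_kafka import Consumer

    consumer = Consumer({
        "bootstrap.servers": "pkc-xxxxx.region.provider.confluent.cloud:9092",  # Confluent Bootstrap Server
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "PLAIN",
        "sasl.username": "<API Key>",
        "sasl.password": "<API Key Secret>",
        "group.id": "fabric-consumer-group",   # Consumer group
        "auto.offset.reset": "earliest",       # Reset auto offset: earliest, latest, or none
    })

    consumer.subscribe(["my-topic"])           # Topic
    msg = consumer.poll(10.0)                  # read one event to confirm data is flowing
    if msg is not None and msg.error() is None:
        print(msg.value())
    consumer.close()

Because Kafka tracks offsets per consumer group, reading with a separate group like this doesn't interfere with the consumer group you gave the wizard.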

View data stream details

  1. On the Review and create page, if you select Open eventstream, the wizard opens the eventstream that it created for you with the selected Confluent Cloud Kafka source. To close the wizard, select Close at the bottom of the page.

  2. In Real-Time hub, select All data streams. To see the new data stream, refresh the All data streams page.

    For detailed steps, see View details of data streams in Fabric Real-Time hub.

To learn about consuming data streams, see the following articles: