Quickstart: Data streaming with Event Hubs using the Kafka protocol
This quickstart shows how to stream data into Event Hubs without changing your protocol clients or running your own clusters. You learn how to point your existing Kafka producers and consumers at Event Hubs with nothing more than a configuration change in your applications.
Note
This sample is available on GitHub.
Prerequisites
To complete this quickstart, make sure you have the following prerequisites:
- Read through the Event Hubs for Apache Kafka article.
- An Azure subscription. If you don't have one, create a free account before you begin.
- Create a Windows virtual machine and install the following components:
- Java Development Kit (JDK) 1.7+.
- Download and install a Maven binary archive.
- Git
Create an Event Hubs namespace
When you create an Event Hubs namespace, the Kafka endpoint for the namespace is automatically enabled. You can stream events from your applications that use the Kafka protocol into event hubs. Follow the step-by-step instructions in Create an event hub using Azure portal to create an Event Hubs namespace. If you're using a dedicated cluster, see Create a namespace and event hub in a dedicated cluster.
Note
Event Hubs for Kafka isn't supported in the basic tier.
Send and receive messages with Kafka in Event Hubs
Clone the Azure Event Hubs for Kafka repository.
Navigate to azure-event-hubs-for-kafka/quickstart/java/producer.
Update the configuration details for the producer in src/main/resources/producer.config as follows:
bootstrap.servers=NAMESPACENAME.servicebus.windows.net:9093
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="{YOUR.EVENTHUBS.CONNECTION.STRING}";
Important
Replace {YOUR.EVENTHUBS.CONNECTION.STRING} with the connection string for your Event Hubs namespace. For instructions on getting the connection string, see Get an Event Hubs connection string. Here's an example configuration:
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb://mynamespace.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=XXXXXXXXXXXXXXXX";
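If you prefer not to keep the connection string in a checked-in file, the same settings can be built programmatically before creating the client. The following is a minimal sketch; the class name, the `mynamespace` host, and the key value are placeholders, not part of the sample:

```java
import java.util.Properties;

public class ProducerConfigSketch {
    // Builds the same settings as producer.config in code.
    // "mynamespace" and the SharedAccessKey value are placeholders.
    static Properties buildConfig(String connectionString) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "mynamespace.servicebus.windows.net:9093");
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                + "username=\"$ConnectionString\" password=\"" + connectionString + "\";");
        return props;
    }

    public static void main(String[] args) {
        Properties props = buildConfig("Endpoint=sb://mynamespace.servicebus.windows.net/;"
                + "SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=XXXXXXXXXXXXXXXX");
        System.out.println(props.getProperty("sasl.mechanism"));
    }
}
```

In production, read the connection string from an environment variable or a secret store rather than hardcoding it.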
Run the producer code and stream events into Event Hubs:
mvn clean package
mvn exec:java -Dexec.mainClass="TestProducer"
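Conceptually, the producer side amounts to loading producer.config into a Properties object and sending records with the standard Kafka client. The following is a minimal sketch, not the sample's actual TestProducer; it assumes the kafka-clients dependency from the sample's pom.xml, and the topic name test is an assumption:

```java
import java.io.FileReader;
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.LongSerializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class MinimalProducer {
    public static void main(String[] args) throws Exception {
        // Load the SASL/SSL settings from the config file edited above.
        Properties props = new Properties();
        props.load(new FileReader("src/main/resources/producer.config"));
        props.put("key.serializer", LongSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        // Send a handful of events; the Kafka topic maps to an event hub.
        try (KafkaProducer<Long, String> producer = new KafkaProducer<>(props)) {
            for (int i = 0; i < 10; i++) {
                producer.send(new ProducerRecord<>("test", (long) i, "Event " + i));
            }
            producer.flush();
        }
    }
}
```

Note that the Kafka topic name corresponds to an event hub inside your namespace; if it doesn't exist, Event Hubs can auto-create it (depending on namespace settings) or you can create it in the portal first.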
Navigate to azure-event-hubs-for-kafka/quickstart/java/consumer.
Update the configuration details for the consumer in src/main/resources/consumer.config as follows:
bootstrap.servers=NAMESPACENAME.servicebus.windows.net:9093
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="{YOUR.EVENTHUBS.CONNECTION.STRING}";
Important
Replace {YOUR.EVENTHUBS.CONNECTION.STRING} with the connection string for your Event Hubs namespace. For instructions on getting the connection string, see Get an Event Hubs connection string. Here's an example configuration:
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb://mynamespace.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=XXXXXXXXXXXXXXXX";
Run the consumer code and process events from the event hub using your Kafka clients:
mvn clean package
mvn exec:java -Dexec.mainClass="TestConsumer"
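The consumer side works the same way: load consumer.config, subscribe, and poll. This is a minimal sketch rather than the sample's actual TestConsumer; it assumes the kafka-clients dependency, and the topic name test is an assumption. Event Hubs namespaces come with a default consumer group named $Default:

```java
import java.io.FileReader;
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.LongDeserializer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class MinimalConsumer {
    public static void main(String[] args) throws Exception {
        // Load the SASL/SSL settings from the config file edited above.
        Properties props = new Properties();
        props.load(new FileReader("src/main/resources/consumer.config"));
        props.put("group.id", "$Default"); // Event Hubs' built-in consumer group
        props.put("key.deserializer", LongDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<Long, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("test"));
            while (true) {
                // Poll in a loop; each record is one event from the event hub.
                ConsumerRecords<Long, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<Long, String> record : records) {
                    System.out.printf("Received offset %d: %s%n", record.offset(), record.value());
                }
            }
        }
    }
}
```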
If the event hub has events, the consumer now starts receiving them.
Next steps
In this article, you learned how to stream into Event Hubs without changing your protocol clients or running your own clusters. To learn more, see Apache Kafka developer guide for Azure Event Hubs.