This tutorial shows you how to connect Apache Flink to an event hub without changing your protocol clients or running your own clusters. For more information on Event Hubs' support for the Apache Kafka consumer protocol, see Event Hubs for Apache Kafka.
In this tutorial, you learn how to:

- Clone the example project
- Run a Flink producer
- Run a Flink consumer

Note
This sample is available on GitHub.
To complete this tutorial, make sure you have the following prerequisites:

- A Java Development Kit (JDK). Run apt-get install default-jdk to install the JDK.
- Apache Maven. Run apt-get install maven to install Maven.
- Git. Run sudo apt-get install git to install Git.

An Event Hubs namespace is required to send or receive from any Event Hubs service. See Creating an event hub for instructions to create a namespace and an event hub. Make sure to copy the Event Hubs connection string for later use.
Now that you have the Event Hubs connection string, clone the Azure Event Hubs for Kafka repository and navigate to the flink subfolder:
git clone https://github.com/Azure/azure-event-hubs-for-kafka.git
cd azure-event-hubs-for-kafka/tutorials/flink
Using the provided Flink producer example, send messages to the Event Hubs service.
Update the bootstrap.servers and sasl.jaas.config values in producer/src/main/resources/producer.config to direct the producer to the Event Hubs Kafka endpoint with the correct authentication.
bootstrap.servers={YOUR.EVENTHUBS.FQDN}:9093
client.id=FlinkExampleProducer
sasl.mechanism=PLAIN
security.protocol=SASL_SSL
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
username="$ConnectionString" \
password="{YOUR.EVENTHUBS.CONNECTION.STRING}";
Important
Replace {YOUR.EVENTHUBS.CONNECTION.STRING} with the connection string for your Event Hubs namespace. For instructions on getting the connection string, see Get an Event Hubs connection string. Here's an example configuration:

sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb://mynamespace.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=XXXXXXXXXXXXXXXX";
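If you prefer to assemble these settings in code rather than editing producer.config, the same values can be built as a java.util.Properties object and handed to whatever producer you construct. This is a minimal sketch, not the sample's actual code: the class name, helper name, and placeholder arguments are illustrative assumptions; only the property keys and values mirror the file above.

```java
import java.util.Properties;

public class ProducerConfigSketch {
    // Builds the same settings as producer.config in code.
    // fqdn and connectionString are placeholders supplied by the caller.
    static Properties buildProducerConfig(String fqdn, String connectionString) {
        Properties props = new Properties();
        props.put("bootstrap.servers", fqdn + ":9093");
        props.put("client.id", "FlinkExampleProducer");
        props.put("sasl.mechanism", "PLAIN");
        props.put("security.protocol", "SASL_SSL");
        // Event Hubs uses the literal string "$ConnectionString" as the SASL
        // username; the connection string itself is the password.
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                + "username=\"$ConnectionString\" "
                + "password=\"" + connectionString + "\";");
        return props;
    }

    public static void main(String[] args) {
        Properties p = buildProducerConfig(
                "mynamespace.servicebus.windows.net",
                "Endpoint=sb://mynamespace.servicebus.windows.net/;"
                + "SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=XXXXXXXXXXXXXXXX");
        System.out.println(p.getProperty("bootstrap.servers")); // mynamespace.servicebus.windows.net:9093
    }
}
```

Building the properties in code is convenient when the connection string comes from an environment variable or secret store instead of a checked-in file.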
To run the producer from the command line, generate the JAR and then run from within Maven (or generate the JAR using Maven, then run in Java by adding the necessary Kafka JAR(s) to the classpath):
mvn clean package
mvn exec:java -Dexec.mainClass="FlinkTestProducer"
The producer will now begin sending events to the event hub at topic test and printing the events to stdout.
Using the provided consumer example, receive messages from the event hub.
Update the bootstrap.servers and sasl.jaas.config values in consumer/src/main/resources/consumer.config to direct the consumer to the Event Hubs Kafka endpoint with the correct authentication.
bootstrap.servers={YOUR.EVENTHUBS.FQDN}:9093
group.id=FlinkExampleConsumer
sasl.mechanism=PLAIN
security.protocol=SASL_SSL
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
username="$ConnectionString" \
password="{YOUR.EVENTHUBS.CONNECTION.STRING}";
Important
Replace {YOUR.EVENTHUBS.CONNECTION.STRING} with the connection string for your Event Hubs namespace. For instructions on getting the connection string, see Get an Event Hubs connection string. Here's an example configuration:

sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb://mynamespace.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=XXXXXXXXXXXXXXXX";
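Note that the {YOUR.EVENTHUBS.FQDN} placeholder in bootstrap.servers is simply the host portion of the Endpoint= segment of that same connection string. A small helper (hypothetical, not part of the sample) can derive one from the other:

```java
import java.net.URI;

public class EndpointParser {
    // Extracts the namespace FQDN from an Event Hubs connection string.
    // "Endpoint=sb://mynamespace.servicebus.windows.net/;SharedAccessKeyName=..."
    // yields "mynamespace.servicebus.windows.net".
    static String fqdnFromConnectionString(String connectionString) {
        for (String part : connectionString.split(";")) {
            if (part.startsWith("Endpoint=")) {
                return URI.create(part.substring("Endpoint=".length())).getHost();
            }
        }
        throw new IllegalArgumentException("No Endpoint= segment in connection string");
    }

    public static void main(String[] args) {
        String cs = "Endpoint=sb://mynamespace.servicebus.windows.net/;"
                + "SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=XXXXXXXXXXXXXXXX";
        // The Kafka endpoint for bootstrap.servers (port 9093 for SASL_SSL):
        System.out.println(fqdnFromConnectionString(cs) + ":9093"); // mynamespace.servicebus.windows.net:9093
    }
}
```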
To run the consumer from the command line, generate the JAR and then run from within Maven (or generate the JAR using Maven, then run in Java by adding the necessary Kafka JAR(s) to the classpath):
mvn clean package
mvn exec:java -Dexec.mainClass="FlinkTestConsumer"
If the event hub has events (for example, if your producer is also running), then the consumer now begins receiving events from the topic test.
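For reference, the .config files used in both steps are ordinary Java properties files, which is presumably how the sample reads them at startup; the trailing backslashes in the sasl.jaas.config lines are java.util.Properties line continuations. A self-contained sketch of parsing such settings (an inline string stands in for the file path, for brevity):

```java
import java.io.IOException;
import java.io.StringReader;
import java.io.UncheckedIOException;
import java.util.Properties;

public class ConsumerConfigLoader {
    // Parses Kafka-style key=value settings, as found in consumer.config.
    static Properties load(String contents) {
        Properties props = new Properties();
        try {
            props.load(new StringReader(contents));
        } catch (IOException e) {
            throw new UncheckedIOException(e); // cannot happen with StringReader
        }
        return props;
    }

    public static void main(String[] args) {
        // Inline copy of the consumer.config shown above (JAAS value elided).
        String config = String.join("\n",
                "bootstrap.servers={YOUR.EVENTHUBS.FQDN}:9093",
                "group.id=FlinkExampleConsumer",
                "sasl.mechanism=PLAIN",
                "security.protocol=SASL_SSL");
        Properties props = load(config);
        System.out.println(props.getProperty("group.id")); // FlinkExampleConsumer
    }
}
```

Because Properties.load joins a line ending in a backslash with the next line, the three-line sasl.jaas.config entry in the file is read back as a single value.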
Check out Flink's Kafka Connector Guide for more detailed information about connecting Flink to Kafka.
To learn more about Event Hubs for Kafka, see the following articles:
- Integrate with Apache Kafka Connect: a walkthrough that shows you how to use Kafka Connect with Azure Event Hubs for Kafka.
- Process Apache Kafka events: shows how to process Kafka events that are ingested through event hubs by using Azure Stream Analytics.
- Kafka Streams for Apache Kafka in Event Hubs: learn how to use the Apache Kafka Streams API with the Event Hubs service on Azure Cloud.