This article describes how to connect your Azure Stream Analytics job directly to Confluent Cloud Kafka as an input.
Azure Stream Analytics requires a managed identity to access Azure Key Vault. To configure your Stream Analytics job to use managed identity, navigate to the Managed Identity tab under Configure on the left side.
Azure Stream Analytics uses a librdkafka-based client, and to connect to Confluent Cloud, you need the TLS certificates that Confluent Cloud uses for server authentication. Confluent Cloud uses TLS certificates from Let's Encrypt, an open certificate authority (CA).
Download the ISRG Root X1 certificate in PEM format from the Let's Encrypt website.
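For example, assuming the certificate is still published at the URL below on letsencrypt.org, you can fetch it from the command line:
curl -o isrgrootx1.pem https://letsencrypt.org/certs/isrgrootx1.pem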
Azure Stream Analytics integrates seamlessly with Azure Key Vault to access stored secrets needed for authentication and encryption. Your Azure Stream Analytics job connects to your key vault using managed identity to ensure a secure connection and avoid the exfiltration of secrets. To use the certificate you downloaded, you must first upload it to your key vault.
To upload certificates, you must have "Key Vault Administrator" access to your key vault. Follow these steps to grant admin access:
Note
You must have "Owner" permissions to grant other key vault permissions.
In your key vault, select Access control (IAM).
Select Add > Add role assignment to open the Add role assignment page.
Assign the role using the following configuration:
Setting | Value |
---|---|
Role | Key Vault Administrator |
Assign access to | User, group, or service principal |
Members | <Your account information or email> |
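Alternatively, you can assign the role with Azure CLI. This is a sketch, assuming you already know your key vault's resource ID; fill in the placeholders with your own values:
az role assignment create --role "Key Vault Administrator" --assignee <your account email> --scope <key vault resource ID>
You can look up the resource ID with az keyvault show --name <your key vault> --query id.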
Important
You must have "Key Vault Administrator" permissions access to your Key vault for this command to work properly You must upload the certificate as a secret. You must use Azure CLI to upload certificates as secrets to your key vault. Your Azure Stream Analytics job will fail when the certificate used for authentication expires. To resolve this, you must update/replace the certificate in your key vault and restart your Azure Stream Analytics job.
Make sure you have Azure CLI installed and configured locally with PowerShell. For guidance on setting up Azure CLI, see Get started with Azure CLI.
Log in to Azure CLI:
az login
Switch to the subscription that contains your key vault:
az account set --subscription <subscription name>
For example:
az account set --subscription mymicrosoftsubscription
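Before proceeding, you can confirm which subscription is active:
az account show --query name --output tsv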
Use the following command to upload the certificate as a secret to your key vault. <your key vault> is the name of the key vault you want to upload the certificate to. <name of the secret> is any name you want to give your secret and how it shows up in the key vault. <file path to certificate> is the path to the certificate file; you can right-click the file and copy its path.
az keyvault secret set --vault-name <your key vault> --name <name of the secret> --file <file path to certificate>
For example:
az keyvault secret set --vault-name mykeyvault --name confluentsecret --file C:\Users\Downloads\isrgrootx1.pem
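To confirm the upload succeeded, you can read the secret's metadata back. This sketch reuses the example names above:
az keyvault secret show --vault-name mykeyvault --name confluentsecret --query name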
For your Azure Stream Analytics job to read the secret in your key vault, the job must have permission to access the key vault. Use the following steps to grant the required permissions to your Stream Analytics job:
In your key vault, select Access control (IAM).
Select Add > Add role assignment to open the Add role assignment page.
Assign the role using the following configuration:
Setting | Value |
---|---|
Role | Key Vault Secrets User |
Managed identity | Stream Analytics job for System-assigned managed identity or User-assigned managed identity |
Members | <Name of your Stream Analytics job> or <name of user-assigned identity> |
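As with the earlier assignment, you can grant this role with Azure CLI instead. This sketch assumes a system-assigned managed identity; you can find its principal ID on the job's Managed Identity page:
az role assignment create --role "Key Vault Secrets User" --assignee <principal ID of the job's managed identity> --scope <key vault resource ID>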
Important
To configure your Kafka cluster as an input, the timestamp type of the input topic must be LogAppendTime; it's the only timestamp type Azure Stream Analytics supports. Azure Stream Analytics supports only the numerical decimal format.
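If your topic currently uses the default CreateTime timestamp type, you can change it with the Confluent CLI. This is a sketch assuming a hypothetical topic named my-topic and a CLI context already pointed at your cluster:
confluent kafka topic update my-topic --config "message.timestamp.type=LogAppendTime"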
In your Stream Analytics job, select Inputs under Job topology.
Select Add input > Kafka to open the Kafka New input configuration blade.
Use the following configuration:
Note
For SASL_SSL and SASL_PLAINTEXT, Azure Stream Analytics supports only the PLAIN SASL mechanism.
Property name | Description |
---|---|
Input Alias | A friendly name used in queries to reference your input |
Bootstrap server addresses | A list of host/port pairs to establish the connection to your Confluent Cloud Kafka cluster. Example: pkc-56d1g.eastus.azure.confluent.cloud:9092 |
Kafka topic | The name of your Kafka topic in your Confluent Cloud Kafka cluster. |
Security Protocol | Select SASL_SSL. The mechanism supported is PLAIN. |
Consumer Group Id | The name of the Kafka consumer group that the input should be a part of. It will be automatically assigned if not provided. |
Event Serialization format | The serialization format (JSON, CSV, Avro, Parquet, Protobuf) of the incoming data stream. |
Important
Confluent Cloud supports authentication using API keys, OAuth, or SAML single sign-on (SSO). Azure Stream Analytics doesn't support authentication using OAuth or SAML SSO, so you must connect to Confluent Cloud via the SASL_SSL security protocol, using an API key that has topic-level access.
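If you don't yet have an API key, you can create one with the Confluent CLI. This is a sketch assuming a hypothetical cluster ID lkc-12345; substitute your own cluster's ID:
confluent api-key create --resource lkc-12345
The command returns the API key and secret, which map to the Username and Password settings below.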
Use the following configuration:
Setting | Value |
---|---|
Username | Confluent Cloud API key |
Password | Confluent Cloud API secret |
Key vault name | Name of the Azure Key Vault with the uploaded certificate |
Truststore certificates | Name of the key vault secret that holds the ISRG Root X1 certificate |
Save your configuration. Your Azure Stream Analytics job validates the configuration provided. If your Stream Analytics job can connect to your Kafka cluster, a successful connection shows in the portal.
Note
For direct help with using the Azure Stream Analytics Kafka input, please reach out to askasa@microsoft.com.