How to configure MySQL Debezium Connectors to create a schema history topic in Azure Event Hub
I want to create a schema history topic in Azure Event Hub and am using a configuration like the one below. The connector is not working. Please help me modify the configuration so that the connector works. Thanks!
How to send compressed JSON to Azure Event Hub and read it in Fabric PySpark streaming?
I am sending compressed data into Event Hub to work around the 1 MB hard limit in Azure Event Hub. I also have to read this in PySpark and update a Delta table. The compressed data sent to Event Hub comes through as null in the PySpark stream. How do I read it? This is…
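A common cause of null values here is casting the Event Hub body to a string before decompressing; the body needs to stay binary and be gunzipped first. A minimal sketch of the compress/decompress round trip in plain Python (in PySpark, the `decompress_json` step would typically be wrapped in a UDF over the binary body column — the exact column name depends on your stream source):

```python
import gzip
import json

def compress_json(obj) -> bytes:
    """What the producer does: JSON-encode, then gzip (fits the 1 MB limit more easily)."""
    return gzip.compress(json.dumps(obj).encode("utf-8"))

def decompress_json(body: bytes):
    """What the consumer must do: gunzip the raw binary body, then parse the JSON.
    Decoding the body as a string before this step corrupts it, which is a
    likely source of the nulls."""
    return json.loads(gzip.decompress(body).decode("utf-8"))

event = {"deviceId": "sensor-1", "readings": [21.5, 21.7]}
roundtrip = decompress_json(compress_json(event))
print(roundtrip == event)  # True
```

The same `decompress_json` logic, registered as a UDF and applied before any string cast, is the usual pattern for consuming gzipped events in Structured Streaming.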
About Event Hub Billing Method
Hi everyone, according to the pricing page for Event Hubs, one of the FAQ entries states the following. To understand this well, I would like help working through the next scenarios. Both scenarios involve changing the capacity of…
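As a hedged illustration of how Standard-tier capacity maps to the billing unit: per the published quotas, one throughput unit (TU) allows up to 1 MB/s or 1,000 events/s of ingress and 2 MB/s of egress, and billing is per TU-hour (verify these figures against the current pricing page):

```python
import math

# Hedged per-TU figures from the Event Hubs quotas documentation.
INGRESS_MB_PER_TU = 1.0
INGRESS_EVENTS_PER_TU = 1000.0
EGRESS_MB_PER_TU = 2.0

def required_tus(ingress_mb_s: float, ingress_events_s: float, egress_mb_s: float) -> int:
    """Smallest TU count that covers all three limits at once."""
    return max(
        math.ceil(ingress_mb_s / INGRESS_MB_PER_TU),
        math.ceil(ingress_events_s / INGRESS_EVENTS_PER_TU),
        math.ceil(egress_mb_s / EGRESS_MB_PER_TU),
        1,  # you always pay for at least one TU
    )

# Scenario A: 3.5 MB/s in, 2,000 events/s, 4 MB/s out -> ingress MB/s dominates.
print(required_tus(3.5, 2000, 4))  # 4
# Scenario B: a tiny load still bills for at least 1 TU.
print(required_tus(0.1, 50, 0.1))  # 1
```

The key point for capacity-change scenarios is that the bill follows the highest TU count configured during each hour, not the traffic actually sent.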
Problem sending events to Event Hubs using tenant_id, client_id and secret_id
Hi, I have a problem sending events to Event Hubs using tenant_id, client_id and secret_id: Unauthorized access. 'Send' claim(s) are required to perform this operation. Resource: 'sb://.... Error condition: amqp:unauthorized-access. Using…
Event Hub receives events after 15 s when using the log-to-eventhub policy in APIM
I have exposed a service on APIM. What I want is to send the request to Event Hub. When I use the log-to-eventhub policy, the request is sent to Event Hub, but it only appears there after 15 s. I tried to do the same with the send-request policy and…
Unable to update Cosmos DB documents with Hierarchical partition keys using Azure Stream Analytics job
I have a document in Cosmos DB with hierarchical partition keys. I want to patch a certain property value, but while running the job I am getting this error 👇 The streaming job failed: Stream Analytics job has validation errors: Error connecting to…
How to ingest CDC data from SQL DATA SYNC to Event Hub?
Hi, we have a scenario to implement real-time CDC from SQL DATA SYNC to Azure SQL DB. Which tools are good for near-real-time processing, and how do they compare on price? Using Databricks we can merge streaming data into Delta Lake, but how do we ingest real-time…
Is there any way to automate taking an Azure Event Hub metrics screenshot and posting it to an MS Teams channel at scheduled intervals?
The below process is to be automated: connect to Azure and gather a metrics screenshot from the Azure resource (Azure Event Hub, incoming vs. outgoing messages) with the range set to the last 6 hours, then take the screenshot and post it to MS Teams…
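There is no built-in scheduled-screenshot feature, so one hypothetical approach is a timer-triggered script (e.g. an Azure Functions timer trigger) that renders or links the metrics chart — for instance via the Azure Monitor metrics API — and posts it to a Teams incoming webhook. A stdlib sketch of the webhook-posting half; the webhook URL and chart URL are placeholders:

```python
import json
import urllib.request

def build_teams_payload(metric: str, chart_url: str) -> bytes:
    """Minimal Teams incoming-webhook body: a plain text card with a chart link."""
    return json.dumps(
        {"text": f"Event Hub metrics ({metric}, last 6 hours): {chart_url}"}
    ).encode("utf-8")

def post_to_teams(webhook_url: str, payload: bytes) -> int:
    """POST the card to the channel's incoming-webhook URL (network call)."""
    req = urllib.request.Request(
        webhook_url, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

payload = build_teams_payload(
    "Incoming vs Outgoing Messages",
    "https://example.com/eventhub-chart.png",  # placeholder for a rendered chart
)
# post_to_teams("https://<tenant>.webhook.office.com/...", payload)  # placeholder URL
```

Posting a link (or a chart image rendered server-side) avoids driving a headless browser just to capture pixels, which is the fragile part of literal-screenshot automation.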
How to retrieve an Azure Event Hub schema given an Event Hub namespace and event hub
Hi, I am trying to retrieve an Azure Schema Registry schema for an event hub. I need to get the Avro schema for a specific topic/event hub in the Event Hub namespace. I see how to retrieve the schema given a SchemaId using the SchemaRegistryAsyncClient,…
How are PUs assigned/distributed in the Azure Event Hubs premium tier?
Hi, we have 15+ event hubs in a namespace, each with a varying number of partitions. The problem we are seeing is that we are able to pull only 1 TB per hour, even though the Event Hubs limits are higher than this. We are consuming using Logstash…
How to deserialize an IDictionary object loaded into an Azure Event Hub
I am trying to deserialize data that is coming from Azure Event Hub via the Kafka integration. The data is loaded into the Event Hub by a .NET app, and the message is an IDictionary object as shown below. After reading from the Event Hub, the data that is…
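Assuming the .NET producer serialized the IDictionary to JSON (e.g. with System.Text.Json) — if it used a binary formatter instead, the consumer needs the matching deserializer — the Kafka-consumer side reduces to parsing the raw record bytes. A minimal sketch with a made-up payload:

```python
import json

# Made-up example of what a JSON-serialized IDictionary<string, object>
# might look like on the wire (one Kafka record value, as raw bytes).
raw_value = b'{"DeviceId": "sensor-1", "Temperature": 21.5, "Active": true}'

# Decode the bytes as UTF-8 and parse back into a plain dict.
record = json.loads(raw_value.decode("utf-8"))
print(record["DeviceId"], record["Temperature"])  # sensor-1 21.5
```

Garbled output on the consumer side usually means the producer and consumer disagree on the serialization format, so the first thing to confirm is exactly how the .NET app encodes the dictionary before sending.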
Azure Eventhub Limits
Hi team, I have read the Microsoft document on Azure Event Hubs limits at "https://learn.microsoft.com/en-us/azure/event-hubs/event-hubs-quotas#basic-vs-standard-vs-premium-vs-dedicated-tiers" and I see some discrepancies in it. As…
About Event Hub Premium Ingress limit
Hi, I was reading the docs, and according to this it doesn't have an ingress limit. However, reading here I realized that it does in fact have an ingress limit (5 to 10 MB/s, which is confusing), but it's not stated as clearly as Standard's 1 MB/s. Can you help me understand…
Upload CSV to Event Hub: is it possible?
Hi everyone, I've been running some tests and realized that a given CSV payload is around 35% of the size of the equivalent JSON payload. I want to propose to my boss that we send CSV to Event Hub instead of JSON, and here are my questions: Is it possible? The Data…
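Yes, it is possible: Event Hubs treats the event body as opaque bytes, so any encoding works as long as producers and consumers agree on it. The size saving comes from CSV writing each field name once in the header instead of in every record, as this sketch with made-up rows shows:

```python
import csv
import io
import json

# Made-up telemetry records with the same three fields in every row.
rows = [{"id": i, "name": f"device-{i}", "value": i * 1.5} for i in range(100)]

# JSON: field names are repeated inside every object.
json_bytes = json.dumps(rows).encode("utf-8")

# CSV: field names appear exactly once, in the header row.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["id", "name", "value"])
writer.writeheader()
writer.writerows(rows)
csv_bytes = buf.getvalue().encode("utf-8")

print(f"csv={len(csv_bytes)}B json={len(json_bytes)}B "
      f"ratio={len(csv_bytes) / len(json_bytes):.0%}")
```

The trade-off is losing self-describing payloads and nested structure, so the consumer must know the column order and types out of band (e.g. via a schema registry).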
Not able to forward data from Azure Event Hub to an HTTP webhook using the Azure portal
Dear community, I am facing an issue with an event subscription from Azure Event Hub using the Azure portal. I think the main problem is that the Event Hub is not sending events to the Event Grid topic. I have created the subscription on this topic and am forwarding to…
Synapse Spark Pool(spark 3.1): unable to find class: org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginCallbackHandler
I am trying to test a Kafka consumer to read events from Azure Event Hub and load them as a DataStream in Python using an Azure Synapse workspace. I need to authenticate using a service principal and secret. Please advise me on what I might be doing wrong or if I am…
Is there a configuration in EventHub to decompress gzipped messages automatically?
Hi, messages ingested into Event Hub in gzip format are being fetched in JSON format. I haven't found any configuration that might explain this. Could it be that Azure decompresses messages automatically? Is there a configuration relevant to this…
Azure Event Hubs time-sync related questions
Could you provide insights into how Azure Event Hubs synchronizes its clock? Specifically, which NTP (Network Time Protocol) server does Azure Event Hubs use for time synchronization? Is that NTP server available publicly or privately?
Questions about Event Hubs SDK usage and implementation mechanics
We use the Azure SDK in China, accessing the event hub through AAD. In the code, when the ClientSecretCredential does not specify the China region, packet capture shows access to the global endpoint. However, it is strange that the…
How to setup multiple consumers for an Azure Event Hub
Problem statement: I have an event hub to which multiple apps send messages. The load is very high (a million records per 5 hours). I have only one Azure Function as an Event Hub consumer, which consumes the messages and inserts them…
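Within a single consumer group, Event Hubs scales out by giving each partition to exactly one consumer instance, so the partition count caps useful parallelism. A toy simulation of the even split that the load balancer (e.g. the EventProcessorClient-based scaling behind Azure Functions) converges to — the real assignment is dynamic, not a fixed round-robin:

```python
def assign_partitions(partition_count: int, consumer_count: int) -> dict:
    """Round-robin split of partitions across consumer instances: each
    partition is owned by exactly one consumer in the consumer group."""
    owners = {c: [] for c in range(consumer_count)}
    for p in range(partition_count):
        owners[p % consumer_count].append(p)
    return owners

# 32 partitions shared by 4 function instances -> 8 partitions each.
owners = assign_partitions(32, 4)
print({c: len(ps) for c, ps in owners.items()})  # {0: 8, 1: 8, 2: 8, 3: 8}
```

In practice this means the single Azure Function can scale out to at most one instance per partition, and additional throughput beyond that comes from batching the downstream inserts rather than adding more consumers.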