Which components would I use, and how would I configure them, to handle the following use case?

McCann, J (John) 6 Reputation points
2024-08-27T18:59:48.39+00:00

I need to provide a data source that is updated in real time for access by my clients. 

Assume that my system of record is generating two streams of events for consumption: 1) API Message Request and 2) API Message Response. The number of event streams will grow.

The client can access data in event stream 1, event stream 2, or both event streams. The event records within each stream will contain the client ID. The solution must match the client ID claim from the auth token against the client ID field in the event records to authorize access to those records.

The solution must produce events in sequence and maintain the record read marker by client.

I need a backup for event stream data that ages off, or for cases where a client has an issue with their local copy of the data.

Do I implement Event Hubs plus Event Grid to maintain the record sequence and read marker per client?

I have a concern about the number of entities (partitions, consumer groups, Service Bus topics, etc.) that I will need to maintain.

Tags: Azure Event Hubs, Azure Event Grid

1 answer

  1. NIKHILA NETHIKUNTA 3,270 Reputation points Microsoft Vendor
    2024-08-28T07:15:35.38+00:00

    Hi @McCann, J (John)
    Thank you for the question and for using the Microsoft Q&A platform.

    Based on your requirements, Azure Event Hubs would be a good fit for your use case. Event Hubs is a real-time event-streaming and ingestion service that can handle millions of events per second, making it well suited to scenarios where you need to expose a data source that is updated in real time for your clients.

    Event Hubs allows you to create multiple event hubs within a namespace (one per event stream), which your clients can consume based on their needs. You can also use the Event Hubs Capture feature to automatically store events in Azure Blob Storage or Azure Data Lake Storage for backup purposes.
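    If it helps, here is a minimal publishing sketch with the Python SDK (the connection string, hub name, and `publish_event` helper are illustrative, not part of your existing setup). It stamps each event with the client ID and uses it as the partition key so that a client's events land in one partition and stay in sequence:

    ```python
    # pip install azure-eventhub
    import json
    from azure.eventhub import EventHubProducerClient, EventData

    # Illustrative values -- substitute your own namespace, hub, and secret handling.
    CONNECTION_STR = "<event-hubs-namespace-connection-string>"
    EVENTHUB_NAME = "api-message-request"  # one event hub per stream (request vs. response)

    def publish_event(client_id: str, payload: dict) -> None:
        producer = EventHubProducerClient.from_connection_string(
            CONNECTION_STR, eventhub_name=EVENTHUB_NAME
        )
        with producer:
            # Using the client ID as the partition key routes all of that client's
            # events to the same partition, so they are read back in sequence.
            batch = producer.create_batch(partition_key=client_id)
            event = EventData(json.dumps(payload))
            # Expose the client ID as an application property so the consuming
            # layer can match it against the claim in the caller's auth token.
            event.properties = {"client_id": client_id}
            batch.add(event)
            producer.send_batch(batch)

    publish_event("client-123", {"messageType": "request", "body": "..."})
    ```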

    To maintain the record sequence and read marker per client, you can use consumer groups in Event Hubs. Consumer groups enable multiple consuming applications to each have a separate view of the event stream and to read it independently, at their own pace and with their own offsets. In this model each client gets its own consumer group; note that the read marker itself (the offset, or checkpoint) is persisted by the consuming application, typically through the SDK's checkpoint store in Azure Blob Storage. Record sequence is preserved within a partition, so publishing each client's events with the client ID as the partition key (as in the sketch above) keeps them in order.
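    As a minimal consumption sketch (assuming one consumer group per client and a Blob Storage checkpoint store; the connection strings, container, and consumer group names are illustrative), the handler checks the client ID on each event against the ID expected for the authenticated caller and then checkpoints, which is what persists the read marker:

    ```python
    # pip install azure-eventhub azure-eventhub-checkpointstoreblob
    from azure.eventhub import EventHubConsumerClient
    from azure.eventhub.extensions.checkpointstoreblob import BlobCheckpointStore

    EVENTHUB_CONNECTION_STR = "<event-hubs-namespace-connection-string>"
    EVENTHUB_NAME = "api-message-request"
    CONSUMER_GROUP = "client-123"          # consumer group dedicated to this client
    STORAGE_CONNECTION_STR = "<storage-account-connection-string>"
    CHECKPOINT_CONTAINER = "eventhub-checkpoints"
    AUTHORIZED_CLIENT_ID = "client-123"    # taken from the validated auth token claim

    def on_event(partition_context, event):
        # Application properties may arrive as bytes or str depending on the transport.
        raw = event.properties.get(b"client_id") or event.properties.get("client_id")
        client_id = raw.decode() if isinstance(raw, bytes) else raw
        # Only surface records whose client ID matches the caller's token claim.
        if client_id == AUTHORIZED_CLIENT_ID:
            print(event.body_as_str())
        # Persist the read marker (offset) for this consumer group and partition.
        partition_context.update_checkpoint(event)

    checkpoint_store = BlobCheckpointStore.from_connection_string(
        STORAGE_CONNECTION_STR, CHECKPOINT_CONTAINER
    )
    client = EventHubConsumerClient.from_connection_string(
        EVENTHUB_CONNECTION_STR,
        consumer_group=CONSUMER_GROUP,
        eventhub_name=EVENTHUB_NAME,
        checkpoint_store=checkpoint_store,
    )
    with client:
        # "-1" starts from the beginning of the stream on the first run;
        # on later runs the stored checkpoint takes precedence.
        client.receive(on_event=on_event, starting_position="-1")
    ```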

    Regarding your concern about the number of entities you will need to maintain: the count depends on the scale of your application. Each event stream maps to an event hub, and in the design above each client adds a consumer group. Keep the Event Hubs quotas in mind here; for example, the Standard tier allows up to 20 consumer groups per event hub, so with a large number of clients you may need a higher tier (Premium or Dedicated), which raises that limit. Event Hubs itself is designed for large-scale event streaming, so throughput should not be the limiting factor.

    In summary, Azure Event Hubs with Event Hubs Capture and consumer groups can meet your requirements for real-time event streaming, backup of event stream data, and maintaining record sequence and a read marker per client.

    https://learn.microsoft.com/en-us/azure/event-hubs/event-hubs-features#capture
    https://learn.microsoft.com/en-us/azure/stream-analytics/event-hubs-parquet-capture-tutorial
    https://learn.microsoft.com/en-us/azure/event-hubs/event-hubs-features#consumer-groups

    Hope this helps. Do let us know if you have any further queries.



