Hello @Cloud Texavie ,
Welcome to the Microsoft Q&A platform.
After researching this, unfortunately I could not find a recommended way to stream real-time data from Azure Databricks directly to a custom web app deployed with App Service.
Here are a couple of alternatives:
If you are using Spark Structured Streaming (micro-batching), it is perhaps best to write the processed data to a low-latency key-value/document store such as Cosmos DB. The web app can then read from there; I have not seen other architectures for this in production.
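As a rough sketch of that pattern, assuming the Azure Cosmos DB Spark 3 (OLTP) connector is installed on the Databricks cluster; the account endpoint, key, database, and container names below are placeholders:

```python
# Sketch: Spark Structured Streaming writing micro-batches to Cosmos DB.
# Assumes the Azure Cosmos DB Spark 3 connector library is attached to the
# cluster; all account/database/container values are placeholders.
cosmos_config = {
    "spark.cosmos.accountEndpoint": "https://<account>.documents.azure.com:443/",
    "spark.cosmos.accountKey": "<account-key>",
    "spark.cosmos.database": "streaming",
    "spark.cosmos.container": "events",
}

def write_to_cosmos(processed_df, checkpoint_path="/tmp/checkpoints/cosmos"):
    """Start a streaming write of processed_df (a streaming DataFrame)
    into the Cosmos DB container configured above."""
    return (processed_df.writeStream
            .format("cosmos.oltp")                     # Cosmos DB OLTP write path
            .options(**cosmos_config)
            .option("checkpointLocation", checkpoint_path)
            .outputMode("append")
            .start())
```

The web app then reads the latest documents from Cosmos DB through its SDK or REST API instead of connecting to the Spark cluster.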
If you are using Kafka/Confluent for ingestion and don't really require a Spark streaming component behind it, the Confluent ecosystem suggests using the HTTP Sink connector: https://www.confluent.io/blog/webify-event-streams-using-kafka-connect-http-sink/
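A minimal connector configuration for that approach might look like the following (property names are from the Confluent HTTP Sink connector; the connector name, topic, and endpoint URL are placeholders):

```json
{
  "name": "http-sink-webapp",
  "config": {
    "connector.class": "io.confluent.connect.http.HttpSinkConnector",
    "topics": "processed-events",
    "http.api.url": "https://<your-app>.azurewebsites.net/api/events",
    "tasks.max": "1"
  }
}
```

The connector POSTs each record to the web app endpoint; authentication and retry settings, which the connector also supports, are omitted here for brevity.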
One can also combine Kafka and Spark streaming: Spark writes the processed data back to another Kafka topic, and the HTTP Sink connector picks it up from there.
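A sketch of that write-back step, using Spark's built-in Kafka sink (bootstrap servers, topic name, and checkpoint path are placeholders):

```python
# Sketch: write processed streaming results back to a Kafka topic, from
# where the HTTP Sink connector can forward them to the web app.
def write_back_to_kafka(processed_df, bootstrap_servers, topic,
                        checkpoint_path="/tmp/checkpoints/kafka-writeback"):
    """Serialize each row of processed_df to JSON and stream it to Kafka.
    The Kafka sink expects a string/binary column named "value"."""
    return (processed_df
            .selectExpr("to_json(struct(*)) AS value")  # one JSON doc per row
            .writeStream
            .format("kafka")
            .option("kafka.bootstrap.servers", bootstrap_servers)
            .option("topic", topic)
            .option("checkpointLocation", checkpoint_path)
            .start())
```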
That said, all of these options depend on many factors: SLAs, use cases, cost considerations, performance/scalability, ease of operations, etc.
For reference:
Structured streaming with Azure Databricks into Power BI & Cosmos DB
Stream processing with Azure Databricks
Hope this helps. Do let us know if you have any further queries.
Please don't forget to Accept Answer and Up-Vote wherever the information provided helps you, as this can be beneficial to other community members.