Your Current Plan Overview
The plan is to pipe data from Azure Data Explorer (ADX) through Azure Event Hubs to Azure Stream Analytics (ASA) jobs for aggregation, then persist it as AVRO files in ADLS for historical retention. You are also considering Azure Synapse Analytics for daily upserts: data would be incrementally loaded into staging tables in the dedicated SQL pool and merged into the target tables by Spark notebooks, with idempotent pipelines and a type 1 Slowly Changing Dimension (SCD) approach.
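The type 1 SCD merge the Spark notebooks would perform can be sketched in plain Python (the key and column names here are hypothetical; in Synapse this would typically be a MERGE statement or a Spark notebook operating on the staging and dimension tables):

```python
# Sketch of an idempotent type 1 SCD merge: staging rows overwrite
# matching dimension rows by business key; new keys are inserted.
# "customer_id" and "city" are illustrative names, not from the plan.

def scd1_merge(dimension: dict, staging: list, key: str = "customer_id") -> dict:
    """Merge staging rows into the dimension keyed by `key`.

    Type 1: existing attributes are overwritten (no history kept).
    Re-running the merge with the same staging batch yields the same
    result, which is what makes the pipeline idempotent.
    """
    merged = dict(dimension)          # leave the input dimension untouched
    for row in staging:
        merged[row[key]] = row        # upsert: insert new key or overwrite
    return merged

dim = {1: {"customer_id": 1, "city": "Oslo"}}
stg = [{"customer_id": 1, "city": "Bergen"},     # update
       {"customer_id": 2, "city": "Stavanger"}]  # insert

once = scd1_merge(dim, stg)
twice = scd1_merge(once, stg)
assert once == twice  # replaying the batch changes nothing
```

The `once == twice` check is the property worth testing in the real pipeline too: a failed run can be safely retried only if replaying a batch is a no-op.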
You plan to create two Power BI connections: one to Synapse for snapshot history and another for daily updates from the ASA jobs.
What do I recommend?
Your pipeline spans multiple steps and services, each of which adds latency and complexity. Look for opportunities to remove steps or consolidate processes where it makes sense. For instance, Azure Synapse has native integration with both ADLS and Power BI, so you may be able to simplify the architecture by dropping the dependency on ASA for aggregation, provided Synapse can handle those aggregation workloads directly.
Azure Synapse Link (for Azure Cosmos DB and other services) provides near-real-time analytical access to data without disrupting your transactional workloads. Using such "link" services to cut latency and complexity in your pipeline is only an option if similar integrations exist for the services you use now or plan to adopt soon.
Make sure the chosen architecture scales to your data volumes and performance requirements. For example, avoid using Synapse Spark pools for simple transformations that ASA jobs or SQL pool stored procedures can handle; spinning up Spark for such work may be more resource-intensive than necessary.
If you need real-time dashboards, favor Power BI's DirectQuery mode for the ASA job outputs, since it queries the source directly instead of importing the data into Power BI. This can reduce dashboard latency, although you should assess the effect on query performance and cost. For historical analysis, Import mode may be the better fit, especially when the data is large and updated infrequently.
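The decision above can be written down as a rough rule of thumb. This is an illustrative heuristic, not official Power BI guidance; note that Power BI also supports composite models that mix DirectQuery and Import tables, which maps naturally onto your two-connection plan:

```python
def choose_storage_mode(needs_realtime: bool, has_large_history: bool) -> str:
    """Illustrative heuristic for picking a Power BI storage mode.

    - Real-time visuals + a large historical model -> a composite model
      (DirectQuery for the live ASA tables, Import for the history).
    - Real-time only -> DirectQuery.
    - Otherwise -> Import (cached model, fastest visuals).
    """
    if needs_realtime and has_large_history:
        return "Composite"
    if needs_realtime:
        return "DirectQuery"
    return "Import"

# Live dashboard fed by ASA job outputs:
print(choose_storage_mode(True, False))   # DirectQuery
# Nightly-refreshed historical snapshots in Synapse:
print(choose_storage_mode(False, True))   # Import
```

Whatever mode the heuristic suggests, validate it against your actual query latency and refresh costs before committing.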
Continuously monitor your data pipeline and Power BI dashboards with Azure Monitor and other performance-monitoring tools. Look for bottlenecks and inefficiencies in the data flow, query performance issues, and wasted resources. Periodically audit and adjust your data models, queries, and pipeline structure to meet your freshness requirements while keeping complexity manageable.
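One concrete thing to track is per-stage dwell time. A minimal sketch of flagging bottlenecks from stage timestamps (the stage names and the 60-second threshold are assumptions for the example; in practice these timestamps would come from Azure Monitor metrics or pipeline logs):

```python
from datetime import datetime, timedelta

def find_bottlenecks(stage_timestamps: list,
                     threshold: timedelta = timedelta(seconds=60)) -> list:
    """Return names of stages whose dwell time exceeds `threshold`.

    `stage_timestamps` lists (stage_name, time the record left that
    stage) in pipeline order, so the gap between consecutive entries
    is the dwell time of the *later* stage.
    """
    slow = []
    for (_, t0), (name, t1) in zip(stage_timestamps, stage_timestamps[1:]):
        if t1 - t0 > threshold:
            slow.append(name)
    return slow

base = datetime(2024, 1, 1)
ts = [("ADX ingest",      base),
      ("Event Hubs",      base + timedelta(seconds=5)),
      ("ASA aggregation", base + timedelta(seconds=95)),   # 90 s dwell
      ("ADLS write",      base + timedelta(seconds=100))]

print(find_bottlenecks(ts))  # ['ASA aggregation']
```

A check like this, wired to an alert, turns "monitor all the time" into something actionable: you learn which hop in the ADX → Event Hubs → ASA → ADLS chain is eating the latency budget.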