For a supply chain management firm, we need some advice on best practices specific to Azure.
The ultimate goal of the project is to monitor operational performance metrics in real time in Power BI, and to keep all data in an environment where it can be analyzed and processed easily.
Facts, assumptions and expectations are as follows:
1- The company manages its operations via an in-house-developed web application backed by a SQL Server database running in a virtual environment.
2- This relational database is subject to very heavy read/write activity 18/7 and is NOT considered for migration to Azure. Moreover, the daily transactional data volume is up to 5 GB.
3- It has been decided to use Power BI as the final reporting/dashboarding tool.
4- In the near future, the company intends to run machine learning algorithms on top of this data to support its mid-term business strategies.
1- Transferring (syncing) data from the on-premises SQL Server to the Azure environment to meet real-time monitoring requirements has its limits. Keep in mind that this is a heavily loaded transactional database and that records are updated retrospectively, which makes syncing much harder: a change can land on a row whose timestamps lie entirely in the past. What would be your suggestion to meet the real-time expectations?
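To make the retrospective-update problem in point 1 concrete, here is a minimal Python sketch with simulated data (all names and values are hypothetical). It contrasts a naive business-date watermark with a monotonically increasing change version, along the lines of what SQL Server rowversion or Change Tracking provides: the date watermark silently misses a back-dated update, while the version watermark catches it.

```python
# Simulated table rows: (id, business_date, change_version).
# change_version plays the role of a rowversion / Change Tracking version:
# it increases on every write, regardless of the row's business date.
rows = [
    (1, "2023-01-01", 101),
    (2, "2023-01-02", 102),
    (3, "2023-01-03", 103),
]

# A retrospective update: the business date stays in the past,
# but the database bumps the change version to the newest value.
rows[0] = (1, "2023-01-01", 104)

def sync_by_timestamp(rows, last_seen_date):
    # Naive watermark: only picks up rows with a newer business date.
    return [r for r in rows if r[1] > last_seen_date]

def sync_by_version(rows, last_seen_version):
    # Version watermark: picks up anything written since the last sync.
    return [r for r in rows if r[2] > last_seen_version]

# The previous sync saw everything up to 2023-01-03 / version 103.
missed = sync_by_timestamp(rows, "2023-01-03")  # -> [] : the update is lost
caught = sync_by_version(rows, 103)             # -> [(1, "2023-01-01", 104)]
```

This is only an illustration of the watermark logic, not a full pipeline; in practice the version column would be read by whatever incremental-copy mechanism you choose (e.g. a Data Factory incremental load keyed on it).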
2- Power BI Premium includes Azure Analysis Services (AAS) behind the scenes. However, we are under the assumption that this embedded AAS does not run on the best hardware, which leads us to consider provisioning a standalone AAS instance and using it as the data source in Power BI. What would be your suggestions on this matter?
3- The following pipeline and/or tools seem to be the plan: on-premises SQL Server -> (via Data Factory) -> Azure SQL on Azure Data Lake -> AAS -> Power BI. Any suggestions on this?
Please feel free to ask if anything is unclear.