AI at the edge with Azure Stack Hub

Azure Container Registry
Azure Kubernetes Service (AKS)
Azure Machine Learning
Azure Stack Hub

Solution ideas

This article describes a solution idea. Your cloud architect can use this guidance to help visualize the major components for a typical implementation of this architecture. Use this article as a starting point to design a well-architected solution that aligns with your workload's specific requirements.

This architecture shows how you can bring your trained AI model to the edge with Azure Stack Hub and integrate it with your applications for low-latency intelligence.


Architecture diagram showing an AI-enabled application that's running at the edge with Azure Stack Hub.

Download a Visio file of this architecture.


  1. Azure Data Factory processes the data.
  2. Azure Data Factory places the data into Azure Data Lake Storage for training.
  3. Data scientists train a model by using Azure Machine Learning. The model is containerized and put into an Azure container registry.
  4. The model is deployed to a Kubernetes cluster on Azure Stack Hub.
  5. The on-premises web application scores data that end users provide against the model that's deployed in the Kubernetes cluster.
  6. End users provide data that's scored against the model.
  7. Insights and anomalies from scoring are placed into a queue.
  8. A function app is triggered when scoring information is placed in the queue.
  9. The function sends compliant data and anomalies to Azure Storage.
  10. Globally relevant and compliant insights are available for consumption in Power BI and a global application.
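Steps 7 through 9 amount to a small routing decision: each scoring result pulled from the queue is classified as compliant or anomalous and sent to the matching storage location. The sketch below illustrates that logic only; the message shape, the `anomaly_score` field, the threshold, and the container names are all hypothetical, and a real implementation would wire this into an Azure Functions queue trigger and upload the record with the Azure Storage SDK.

```python
import json

# Hypothetical threshold: results scoring above it are treated as anomalies.
ANOMALY_THRESHOLD = 0.8

def route_scoring_message(message_body: str) -> tuple[str, dict]:
    """Decide which storage container a scoring result belongs in.

    Mirrors steps 7-9: the queue message carries the model's output,
    and the function separates compliant data from anomalies.
    """
    record = json.loads(message_body)
    container = "anomalies" if record["anomaly_score"] > ANOMALY_THRESHOLD else "compliant"
    return container, record

# Example messages as they might arrive from the queue:
high = route_scoring_message('{"device": "cam-01", "anomaly_score": 0.95}')
low = route_scoring_message('{"device": "cam-02", "anomaly_score": 0.10}')
```

Keeping the routing decision in a plain function like this, separate from the trigger binding and the storage upload, also makes the logic straightforward to unit test.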


Key technologies used to implement this architecture:

  • Azure Container Registry
  • Azure Kubernetes Service (AKS)
  • Azure Machine Learning
  • Azure Stack Hub
  • Azure Data Factory
  • Azure Data Lake Storage
  • Azure Functions
  • Azure Storage
  • Power BI

Scenario details

Azure AI tools and the Azure edge and cloud platform make edge intelligence possible. The next generation of AI-enabled hybrid applications can run where your data lives. With Azure Stack Hub, you can bring a trained AI model to the edge, integrate it with your applications for low-latency intelligence, and continuously feed results back into a refined AI model for improved accuracy, with no tool or process changes for local applications. This solution idea shows a connected Azure Stack Hub scenario, in which edge applications are connected to Azure. For the disconnected-edge version of this scenario, see AI at the edge - disconnected.
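To make the low-latency scoring path concrete, a client on the local network could post data to the model's endpoint on the Azure Stack Hub Kubernetes cluster. The endpoint URL, the `{"data": [...]}` payload shape, and the `predictions` response field below are assumptions for illustration, not the actual API of this architecture; Azure Machine Learning scoring containers commonly accept a similar JSON body.

```python
import json
import urllib.request

def build_scoring_request(values, url="http://aks-edge.local/score"):
    """Construct an HTTP request for a hypothetical edge scoring endpoint.

    The URL and payload shape are assumptions made for this sketch.
    """
    body = json.dumps({"data": values}).encode("utf-8")
    return urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )

def parse_scoring_response(raw: bytes):
    """Extract predictions from the (assumed) JSON response body."""
    return json.loads(raw.decode("utf-8"))["predictions"]

# Build a request and parse a sample response; no network call is made here.
req = build_scoring_request([[1.0, 2.0, 3.0]])
preds = parse_scoring_response(b'{"predictions": [0.12]}')
```

Because the cluster runs on Azure Stack Hub, the round trip stays on the local network, which is what keeps scoring latency low for the on-premises web application.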

Potential use cases

There's a wide range of edge AI applications that monitor and provide information in near real time. Areas where edge AI can help include:

  • Security camera detection processes.
  • Image and video analysis in the media and entertainment industry.
  • Transportation and traffic in the automotive and mobility industry.
  • Manufacturing.
  • Energy, including smart grids.

Next steps

For more information about the featured Azure services, see the following articles and samples:

See the following related architectures: