Forecast energy and power demand with machine learning

Azure Machine Learning
Azure Data Factory
Power BI

Solution ideas

This article describes a solution idea. Your cloud architect can use this guidance to help visualize the major components for a typical implementation of this architecture. Use this article as a starting point to design a well-architected solution that aligns with your workload's specific requirements.

Learn how Azure Machine Learning can help forecast spikes in demand for energy products and services.


Architecture diagram that shows how Azure services like Machine Learning are used in a solution that forecasts energy and power demand.

Download a Visio file of this architecture.


  1. Time series data can be stored in various formats, depending on its original source. Data can be stored as files within Azure Data Lake Storage or in tabular form in Azure Synapse or Azure SQL Database.
  2. Read: Azure Machine Learning (ML) can connect to and read from such sources. Ingesting time series data into Azure Machine Learning enables automated machine learning (AutoML) to preprocess the data and to train and register a model.
  3. The first step within AutoML is configuring and preprocessing the time series data. In this step, the provided data is prepared for training. The preprocessing applies the following feature engineering and forecasting configurations:
    • Imputed missing values
    • Holiday and DateTime feature engineering
    • Lags and rolling windows
    • Rolling origin cross validation
  4. During the training stage, AutoML uses the preprocessed dataset to train, select, and explain the best forecasting model.
    • Model training: A wide range of machine learning models can be used, including classical forecasting models, deep neural networks, and regression models.
    • Model evaluation: AutoML assesses the performance of each trained model, which enables you to select the best-performing model for deployment.
    • Explainability: AutoML provides explainability for the selected model, which enables you to better understand what features are driving the model outcomes.
  5. The best-performing model is registered in Azure Machine Learning by AutoML, which makes it available for deployment.
  6. Deploy: The model registered in Azure Machine Learning can be deployed, which provides a live endpoint that can be exposed for inferencing.
  7. The deployment can be done through Azure Kubernetes Service (AKS), which runs a managed Kubernetes cluster where the containers are deployed from images stored in Azure Container Registry. Alternatively, Azure Container Instances can be used instead of AKS.
  8. Inference: After the model is deployed, new data can be scored via the available endpoint. Both batch and near-real-time predictions are supported. The inference results can be stored as documents within Azure Data Lake Storage or in tabular form in Azure Synapse or Azure SQL Database.
  9. Visualize: The stored model results can be consumed through user interfaces, such as Power BI dashboards, or through custom-built web applications. The results are written to a storage option in file or tabular format and can then be indexed by Azure Cognitive Search. The model runs as batch inference and stores the results in the respective datastore.
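The preprocessing steps in step 3 (imputing missing values, DateTime feature engineering, lags, rolling windows, and rolling-origin cross-validation) can be sketched in plain pandas to show what they produce. This is a minimal illustration of the techniques, not AutoML's internal implementation; the column names and values are made up.

```python
import pandas as pd

# Hypothetical hourly demand series with one missing reading.
df = pd.DataFrame({
    "timestamp": pd.date_range("2024-01-01", periods=8, freq="h"),
    "demand": [100.0, 102.0, None, 98.0, 105.0, 110.0, 108.0, 112.0],
})

# Impute missing values (forward fill is one simple strategy).
df["demand"] = df["demand"].ffill()

# DateTime feature engineering.
df["hour"] = df["timestamp"].dt.hour
df["dayofweek"] = df["timestamp"].dt.dayofweek

# Lag and rolling-window features.
df["lag_1"] = df["demand"].shift(1)
df["rolling_mean_3"] = df["demand"].rolling(window=3).mean()

# Rolling-origin cross-validation: each fold trains on an expanding
# window of history and validates on the next forecast horizon.
def rolling_origin_splits(n, initial, horizon):
    origin = initial
    while origin + horizon <= n:
        yield list(range(origin)), list(range(origin, origin + horizon))
        origin += horizon

splits = list(rolling_origin_splits(len(df), initial=4, horizon=2))
# Fold 1 trains on rows 0-3 and validates on rows 4-5;
# fold 2 trains on rows 0-5 and validates on rows 6-7.
```

Rolling-origin validation keeps the train/validation boundary moving forward in time, which avoids leaking future observations into training, unlike random k-fold splits.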


Scenario details

Energy consumption and energy demand change over time. Monitoring this change produces time series that can be used to understand patterns and to forecast future behavior. Azure Machine Learning can help forecast spikes in demand for energy products and services.
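To make the forecasting idea concrete, the sketch below builds a seasonal-naive baseline (repeat the last observed week) and scores it with mean absolute percentage error (MAPE), a common metric for ranking candidate forecasting models. The data values are illustrative, and this baseline stands in for the richer models AutoML trains.

```python
import numpy as np

# Illustrative daily demand with weekly seasonality (made-up values).
history = np.array([100, 90, 95, 97, 110, 130, 125,   # week 1
                    102, 92, 96, 99, 112, 133, 127])  # week 2

# Seasonal-naive forecast: repeat the last observed season (7 days).
season = 7
forecast = history[-season:]

# Hypothetical actuals for the forecast week.
actuals = np.array([101, 93, 97, 98, 111, 131, 126])

def mape(y_true, y_pred):
    """Mean absolute percentage error, in percent."""
    return float(np.mean(np.abs((y_true - y_pred) / y_true)) * 100)

score = mape(actuals, forecast)
```

A baseline like this gives a floor that any trained model should beat; comparing candidate models on the same held-out horizon is essentially what the model-evaluation stage of the architecture does.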

This solution is built on the following Azure managed services:

  • Azure Machine Learning
  • Azure Data Factory
  • Azure Data Lake Storage
  • Azure Synapse
  • Azure SQL Database
  • Azure Kubernetes Service
  • Azure Container Registry
  • Azure Container Instances
  • Azure Cognitive Search
  • Power BI

These services run in a high-availability environment, patched and supported, allowing you to focus on your solution instead of the environment they run in.

Potential use cases

This solution is ideal for the energy industry.


This article is maintained by Microsoft. It was originally written by the following contributors.

Principal author:

Next steps

See the following product documentation:

Learn more: