Oil and gas tank level forecasting

Azure Data Factory
Azure Event Hubs
Azure Machine Learning
Azure Stream Analytics
Azure Synapse Analytics

Solution ideas

This article is a solution idea. If you'd like us to expand the content with more information, such as potential use cases, alternative services, implementation considerations, or pricing guidance, let us know by providing GitHub feedback.

Today, most facilities operate reactively to problems in tank levels. This reactivity often leads to spills, emergency shutdowns, expensive remediation costs, regulatory issues, costly repairs, and fines. Tank level forecasting helps manage and abate these and other problems.

Architecture

Architecture diagram: data flows into Azure Event Hubs and Azure Synapse Analytics. Azure Stream Analytics analyzes the data stream, while Power BI monitors the oil tank level.

Download a Visio file of this architecture.

Dataflow

  1. The data feeds into Azure Event Hubs and Azure Synapse Analytics as data points or events that the rest of the solution flow uses.
  2. Azure Stream Analytics provides near real-time analytics on the input stream from the event hub and publishes results directly to Power BI for visualization.
  3. Azure Machine Learning forecasts the tank level of a particular region, given the inputs that it receives.
  4. Azure Synapse Analytics stores the prediction results received from Azure Machine Learning. These results are then consumed in the Power BI dashboard.
  5. Azure Data Factory handles orchestration and scheduling of the hourly model retraining.
  6. Finally, Power BI visualizes the results, so that users can monitor the tank level of a facility in real time and use the forecast level to prevent spillage.
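Step 1 of the dataflow can be sketched in Python with the `azure-eventhub` SDK. The payload fields (`facilityId`, `tankId`, `levelPercent`) and function names are illustrative assumptions, not the schema used by the deployed solution:

```python
import json
from datetime import datetime, timezone

def make_tank_event(facility_id, tank_id, level_pct):
    """Build one tank-level reading as a JSON event (illustrative schema)."""
    return json.dumps({
        "facilityId": facility_id,
        "tankId": tank_id,
        "levelPercent": round(level_pct, 2),
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })

def send_events(connection_string, eventhub_name, events):
    """Publish a batch of JSON events to Azure Event Hubs.

    Requires the azure-eventhub package and a real connection string,
    so the import is deferred until the function is called.
    """
    from azure.eventhub import EventHubProducerClient, EventData
    producer = EventHubProducerClient.from_connection_string(
        connection_string, eventhub_name=eventhub_name)
    with producer:
        batch = producer.create_batch()
        for body in events:
            batch.add(EventData(body))
        producer.send_batch(batch)
```

In the deployed solution, a web job plays this role by simulating sensor readings and streaming them to the event hub.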

Components

Scenario details

The tank level forecasting process starts at the well input. Oil is measured as it comes into the facility via meters and is sent to tanks. Levels are monitored and recorded in tanks during the refining process. Oil, gas, and water output are recorded via sensors, meters, and records. Forecasts are then made using data from the facility; for example, forecasts can be made every 15 minutes.
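To make the forecasting step concrete, here is a minimal sketch that extrapolates the recent trend in tank-level readings taken at 15-minute intervals. This naive linear-trend model is only a stand-in for the trained Azure Machine Learning model, and the function name is an assumption:

```python
def forecast_level(readings, steps_ahead=1):
    """Naive linear-trend forecast of a tank level.

    `readings` is a list of levels sampled at a fixed interval (for
    example, every 15 minutes). The average change per interval across
    the window is projected `steps_ahead` intervals into the future.
    """
    if len(readings) < 2:
        raise ValueError("need at least two readings to estimate a trend")
    avg_delta = (readings[-1] - readings[0]) / (len(readings) - 1)
    return readings[-1] + avg_delta * steps_ahead
```

A real deployment would replace this with a model trained on historical sensor, meter, and record data, retrained hourly by the Data Factory pipeline.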

This solution is adaptable and can be customized to meet the different requirements that facilities and corporations have.

Potential use cases

This solution is ideal for the energy, automotive, and aerospace industries.

Forecasts are created by harnessing the power of real-time and historical data that's readily available from sensors, meters, and records. The forecasts help with the following scenarios:

  • Prevent tank spillage and emergency shutdowns
  • Discover hardware malfunction or failure
  • Schedule maintenance, shutdowns, and logistics
  • Optimize operations and facility efficiency
  • Detect pipeline leaks and slugging
  • Reduce costs, fines, and downtime
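The first scenario, preventing spillage, can be illustrated with a simple alert rule applied to the forecast output. The function name and the 95% safe-capacity threshold are assumptions for the sketch, not values from the solution:

```python
def spill_alerts(forecasts, capacity_pct=95.0):
    """Flag forecast points at or above the safe-capacity threshold.

    `forecasts` is a sequence of predicted tank levels (percent of
    capacity). Returns (index, level) pairs so operators can see how
    many intervals remain before a predicted overfill and act early.
    """
    return [(i, lvl) for i, lvl in enumerate(forecasts) if lvl >= capacity_pct]
```

In the solution, an equivalent rule would be expressed in the Power BI dashboard or downstream alerting rather than in application code.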

Deploy this scenario

For more details on how this solution is built, see the solution guide on GitHub. That guide was built with a similar, earlier set of Azure AI services.

This solution provides advanced analytics tools through Microsoft Azure: data ingestion, data storage, data processing, and advanced analytics components, all of the essential elements for building a tank level forecasting solution.

This solution combines several Azure services to provide powerful advantages. Event Hubs collects real-time tank level data. Stream Analytics aggregates the streaming data and makes it available for visualization. Azure Synapse Analytics stores and transforms the tank level data. Machine Learning implements and executes the forecasting model. Power BI visualizes the real-time tank level and the forecast results. Finally, Data Factory orchestrates and schedules the entire data flow.
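The aggregation that Stream Analytics performs on the stream can be mimicked locally for intuition. The sketch below groups readings into 15-minute tumbling windows and averages the level per tank; the function name and the `(tank_id, epoch_seconds, level)` event shape are assumptions:

```python
from collections import defaultdict

def tumbling_average(events, window_minutes=15):
    """Average tank level per (tank, tumbling window).

    `events` is an iterable of (tank_id, epoch_seconds, level) tuples.
    Each event is assigned to the non-overlapping window that contains
    its timestamp, mirroring a tumbling-window aggregate in a Stream
    Analytics query.
    """
    window_seconds = window_minutes * 60
    buckets = defaultdict(list)
    for tank_id, epoch_seconds, level in events:
        window_start = epoch_seconds - (epoch_seconds % window_seconds)
        buckets[(tank_id, window_start)].append(level)
    return {key: sum(vals) / len(vals) for key, vals in buckets.items()}
```

In the deployed solution, this logic lives in the Stream Analytics job itself, which then publishes the aggregates to Power BI.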

The 'Deploy' button launches a workflow that deploys an instance of the solution within a resource group in the Azure subscription that you specify. The solution includes multiple Azure services (described earlier), along with a web job that simulates data, so that immediately after deployment you have a working end-to-end solution.

After deployment, see the post-deployment instructions on GitHub.

Next steps

Product documentation:

Microsoft Learn modules: