Azure Batch Preview

I am excited to announce the preview of Azure Batch, a new Azure platform service that provides job scheduling and large-scale resource management for developing and delivering Big Compute applications natively in the cloud.

What is Batch?

Azure Batch originated from seeing our customers repeatedly building their own job queues, schedulers, and VM scaling for applications that used a parallel processing pattern. Developing and supporting this job scheduling infrastructure was time-consuming and took resources away from work on their core service.                                   

With Azure Batch, you submit jobs. We start the VMs, run your tasks, handle any failures, and then shut things down as work completes. This batch computing or parallel processing pattern is used across industries and applications. It’s not just for high performance computing (HPC), but performance is at our core.  

Common uses for batch computing in the cloud include media transcoding and rendering, image analysis and processing, financial risk calculation and Monte Carlo simulations, engineering simulations and stress analysis, weather forecasting, energy exploration, and software test execution.

The batch pattern is widely used and well suited for the cloud, where resources can be provisioned on demand at very high scale and used efficiently to process a lot of work. The programming model is simple—call an executable or script with a set of parameters. There can be a single task per job or hundreds of thousands. The batch pattern makes it easy to use the cloud and take advantage of scale and reliability without needing to learn about managing multiple instances, fault domains, error handling, and other concepts that our service handles for you.  
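The programming model above can be sketched in a few lines of Python. This is not Azure Batch code, just a self-contained illustration of the pattern under the post's own description: a job is a set of independent tasks, each a call with its own parameters, with failed tasks retried transparently.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def run_task(frame):
    # Stand-in for invoking an executable or script with one parameter set.
    if frame < 0:
        raise ValueError(f"bad input: {frame}")
    return f"rendered frame {frame}"

def run_job(inputs, workers=4, retries=2):
    # Fan the job out into independent tasks; retry each failed task up to
    # `retries` times, mirroring the failure handling the service does for you.
    def attempt(i):
        for n in range(retries + 1):
            try:
                return run_task(i)
            except Exception:
                if n == retries:
                    raise
    results = {}
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = {pool.submit(attempt, i): i for i in inputs}
        for fut in as_completed(futures):
            results[futures[fut]] = fut.result()
    return results
```

Whether a job has one task or hundreds of thousands, the shape is the same; only the scheduling and resource management underneath change.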

Batch on Azure Today

Azure Batch is used in production today by several Microsoft teams. One example is Azure Media Services, which uses Batch for its audio and video encoding. Media Services decides which jobs to run and submits the encode tasks to Batch. Auto-scaling rules help Media Services use its VMs efficiently, limit idle capacity, and deliver different tiers of service.
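As a rough illustration of how such a rule limits idle capacity, here is a hypothetical scaling function (not the service's actual formula language): size the pool to the task backlog, clamped between a floor and a budget ceiling.

```python
import math

def target_vms(pending_tasks, tasks_per_vm=4, min_vms=0, max_vms=100):
    # Hypothetical auto-scaling rule: provision enough VMs to cover the
    # backlog, never exceed the ceiling, and scale to zero when idle.
    wanted = math.ceil(pending_tasks / tasks_per_vm)
    return max(min_vms, min(wanted, max_vms))
```

Different service tiers could simply plug in different `tasks_per_vm` and `max_vms` values.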

Azure Batch is also used by the Azure engineering teams to test Azure itself. A common test execution framework is used to manage and schedule tests, which are executed by Batch. Both services operate at high scale and require high reliability.  

Using Azure Batch

We invite you to join our public preview by registering for the preview, learning about the service, and sharing your feedback on the experience and capabilities that you would like to see.                           

This preview is targeted at developers to introduce the service and support early adopters using Batch. The API supports two styles of integration. The first is for developers building a service that requires batch processing and who need fine-grained control over running tasks and the VMs the tasks run on. The second is a higher-level abstraction for integrating batch applications with Azure. This includes parallel or cluster applications, as well as any work that you want to execute in a batch pattern in the cloud. The Azure Batch API supports REST, a C# object model, and Python.
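In the fine-grained style, a task boils down to an id plus a command line to run on a VM. The sketch below builds such a task description in Python; the field names are illustrative only, not the service's actual REST wire format.

```python
def build_task(task_id, command_line, env=None):
    # Illustrative task description for the fine-grained style: one command
    # line with its parameters. The real REST schema may differ.
    body = {"id": task_id, "commandLine": command_line}
    if env:
        # Optional per-task environment variables (hypothetical field name).
        body["environmentSettings"] = [
            {"name": k, "value": v} for k, v in env.items()
        ]
    return body
```

A submitting service would build one of these per unit of work and post the batch of them to a job.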

We call the experience for integrating applications Batch Apps. These capabilities come from GreenButton, a company that Microsoft acquired in May. The API makes it easy to build plugins for desktop applications to submit jobs to a “cluster” in the cloud. Azure Batch then splits the job into one or more tasks to process the application. The job definition can describe the data that should be moved to the cloud for the job, how that data is stored and copied to compute VMs, and how results are merged and returned.
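To make the split-and-merge idea concrete, here is a hypothetical splitter for a frame-rendering job: the job is divided into frame ranges, one per task, and per-task outputs are merged back in order. None of this is the Batch Apps API itself; it only illustrates the flow.

```python
def split_job(total_frames, frames_per_task):
    # Divide one application job into independent frame-range tasks
    # (hypothetical splitter; the real split logic lives in the job definition).
    return [(start, min(start + frames_per_task, total_frames))
            for start in range(0, total_frames, frames_per_task)]

def merge_results(chunks):
    # Merge per-task outputs back into a single ordered result.
    return [item for chunk in chunks for item in chunk]
```

For a 10-frame render with 4 frames per task, `split_job` yields three ranges, and `merge_results` stitches the rendered pieces back together.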

Roadmap

Over the coming months we will add features including application lifecycle management, rich monitoring, role-based access for job submission and service management, custom VM images, and Linux support. We will also move the Batch Apps management UI to the new Azure portal.

We look forward to working with you on Azure Batch, and welcome your feedback and suggestions.  

Getting Started

Sign up for the preview

Learn about Batch

Create an account in the portal once approved for the preview

  

Alex Sutton

Group Program Manager, Azure Big Compute

Comments

  • Anonymous
    January 01, 2003
    @Joe M.
    1. As a service in public preview, we don't have an SLA for VM start-up time, although typical start-up is around 5 to 10 minutes, depending on how many VMs you allocate.
    2. When a pool or VM is deleted, the VMs are destroyed, so they need to be recreated next time.

    Please use Azure Batch forum if you have further questions - https://social.msdn.microsoft.com/forums/azure/en-US/home?forum=azurebatch.
  • Anonymous
    October 29, 2014
    On October 28 the TechEd Europe 2014 conference kicked off, during which several key
  • Anonymous
    October 30, 2014
    This post refers to Exciting updates to Microsoft Azure at TechEd Europe, enabling simplicity, scale, posted on October 28
  • Anonymous
    March 20, 2015
    Would you have more details on:
    1. What is the startup time for a batch vm in the pool?
    2. Do the batch VMs get shutdown when stopped (ie completely deallocated or deleted) so that they need to be recreated when starting up again?
  • Anonymous
    February 14, 2016
    Please add the Linux functionality. Many researchers use Linux for computation.
  • Anonymous
    March 16, 2016
    @I A, thanks for your query. We announced our plan to support Linux on Batch at SC15. We are actively working on it and will share the service as soon as it's ready for customer adoption.