Introduction

Train compute-intensive models using GPU compute in Azure Machine Learning.

Imagine you work as a data scientist for an insurance company. Whenever someone visits the hospital, they have to fill in an expense form to get reimbursed for any care they received. The customer support department analyzes the forms, after which the important information is entered into the system. The system then takes care of reimbursing the customers for their expenses.

The company wants to innovate and speed up this process by creating an application that extracts the information from the form and enters it into the system. You're asked to create a model that recognizes handwritten text and extracts it. You want to train the model in Azure Machine Learning so that you can use scalable compute and easily deploy the model.

After completing this module, you'll be able to recognize compute-intensive workloads and choose the appropriate strategy to train those models in Azure Machine Learning.

Learning objectives

In this module, you'll learn:

  • How to train a model with GPUs in Azure Machine Learning.
  • When to use which GPU option.
  • How to distribute model training.
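
As a preview of the kind of workflow these objectives cover, the following is a minimal sketch of submitting a training script to GPU compute with the Azure Machine Learning Python SDK v2. The workspace details, compute target name (gpu-cluster), script folder, and curated environment reference are illustrative assumptions, not values defined in this module.

```python
# Minimal sketch (assumed names): submit a training script to a GPU compute
# cluster with the Azure Machine Learning Python SDK v2.
from azure.ai.ml import MLClient, command
from azure.identity import DefaultAzureCredential

# Connect to the workspace; fill in your own subscription, resource group, and workspace.
ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace-name>",
)

# Define a command job that runs the training script on a GPU compute cluster.
# The environment string refers to a curated GPU environment; the exact name
# is an assumption and may differ in your workspace.
job = command(
    code="./src",                      # folder containing train.py (assumed)
    command="python train.py",
    environment="AzureML-pytorch-1.13-ubuntu20.04-py38-cuda11.7-gpu@latest",
    compute="gpu-cluster",             # an existing GPU compute cluster (assumed)
    display_name="handwriting-recognition-gpu",
)

# Submit the job and stream its logs.
returned_job = ml_client.jobs.create_or_update(job)
ml_client.jobs.stream(returned_job.name)
```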