Batch Inference using Azure Machine Learning

In this episode we give a quick overview of the new batch inference capability, which lets Azure Machine Learning users run inference on large-scale datasets in a secure, scalable, performant, and cost-effective way by fully leveraging the power of the cloud.

[00:21] Context on Inference

[02:00] Handling High Volume Workloads

[03:05] ParallelRunStep Intro

[03:53] Support for Structured and Unstructured Data

[04:14] Demo Walkthrough

[06:17] ParallelRunStep Config

[07:40] Pre- and Post-Processing
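As context for the demo, a ParallelRunStep entry script follows a simple contract: the framework calls `init()` once per worker process, then `run()` once per mini-batch of inputs (a list of file paths for file datasets, or a DataFrame for tabular data). The sketch below is a minimal, hypothetical illustration of that contract; the uppercase "model" is a stand-in, not real scoring code.

```python
# score.py -- hypothetical ParallelRunStep entry script (illustration only).
# Azure ML calls init() once per worker process, then run() once per
# mini-batch; returned items are collected into the step's output.

model = None

def init():
    """One-time per-worker setup; a real script would load its model here."""
    global model
    model = lambda item: item.upper()  # stand-in for real model inference

def run(mini_batch):
    """mini_batch is a list of file paths (FileDataset input) or a pandas
    DataFrame (TabularDataset input). Return one result per input item."""
    return [model(item) for item in mini_batch]
```

Pre- and post-processing (covered at [07:40]) would typically happen inside `run()`, before and after the model call.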
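The ParallelRunStep configuration discussed at [06:17] wires an entry script to a compute cluster and controls the degree of parallelism. The sketch below uses the `azureml-pipeline-steps` SDK; `batch_env`, `compute_target`, `input_dataset`, and `output_dir` are assumed to be defined elsewhere (an Environment, an AmlCompute cluster, a registered dataset, and a pipeline output), and the names and sizes are placeholders, not recommendations.

```python
from azureml.pipeline.steps import ParallelRunConfig, ParallelRunStep

# Assumed to exist already: batch_env (Environment), compute_target
# (AmlCompute), input_dataset (FileDataset), output_dir (PipelineData).
parallel_run_config = ParallelRunConfig(
    source_directory="scripts",     # folder containing the entry script
    entry_script="score.py",        # must define init() and run()
    mini_batch_size="10",           # files handed to each run() call
    error_threshold=10,             # tolerated item failures before aborting
    output_action="append_row",     # concatenate all run() results
    environment=batch_env,
    compute_target=compute_target,
    node_count=2,                   # cluster nodes to scale across
    process_count_per_node=4,       # worker processes per node
)

batch_step = ParallelRunStep(
    name="batch-inference",
    parallel_run_config=parallel_run_config,
    inputs=[input_dataset.as_named_input("input_files")],
    output=output_dir,
    allow_reuse=False,
)
```

Total parallelism is roughly `node_count * process_count_per_node`, so throughput can be tuned without touching the scoring code.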


The AI Show's Favorite links: