How to load 2 models for ML online inference

Giuseppe


I have 2 pre-trained models that I would like to deploy for inference using a single Azure ML managed online endpoint. One model's output would serve as the input for the other. The examples available on the learning platform and on GitHub all reference only one model per endpoint.

Is it possible to load 2 models under the same managed online endpoint? Do you have a reference guide on how to do it?

Thank you

Azure Machine Learning

1 answer

  1. AshokPeddakotla-MSFT

    Giuseppe, greetings and welcome to the Microsoft Q&A forum!

    Did you check these already? Create a multimodel deployment and Create a multimodel deployment using a custom container.

    Also, see the Azure Machine Learning SDK (v2) examples for more details.
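    To illustrate the idea behind those links: a managed online endpoint runs whatever scoring script you give it, so one deployment can load both model files in `init()` and chain them in `run()`. Below is a minimal sketch of such a script. The `Doubler`/`AddTen` classes, the `model_a.pkl`/`model_b.pkl` file names, and the local smoke test are hypothetical stand-ins; in a real deployment you would register both model files (e.g. as one model asset) so that `AZUREML_MODEL_DIR` points at the folder containing them.

    ```python
    import json
    import os
    import pickle
    import tempfile

    # Hypothetical stand-ins for the two pre-trained models. In a real
    # deployment these would be your registered model files, loaded with
    # whatever framework trained them (joblib, torch, onnxruntime, ...).
    class Doubler:
        def predict(self, xs):
            return [2 * x for x in xs]

    class AddTen:
        def predict(self, xs):
            return [x + 10 for x in xs]

    model_a = None
    model_b = None

    def init():
        """Load both models once, when the deployment container starts.

        Azure ML sets AZUREML_MODEL_DIR to the folder that contains the
        files of the registered model asset(s) for this deployment.
        """
        global model_a, model_b
        model_dir = os.environ["AZUREML_MODEL_DIR"]
        with open(os.path.join(model_dir, "model_a.pkl"), "rb") as f:
            model_a = pickle.load(f)
        with open(os.path.join(model_dir, "model_b.pkl"), "rb") as f:
            model_b = pickle.load(f)

    def run(raw_data):
        """Chain the models: model A's prediction becomes model B's input."""
        data = json.loads(raw_data)["data"]
        intermediate = model_a.predict(data)
        final = model_b.predict(intermediate)
        return json.dumps({"result": final})

    # Local smoke test: stage two pickled models where init() expects them.
    if __name__ == "__main__":
        tmp = tempfile.mkdtemp()
        with open(os.path.join(tmp, "model_a.pkl"), "wb") as f:
            pickle.dump(Doubler(), f)
        with open(os.path.join(tmp, "model_b.pkl"), "wb") as f:
            pickle.dump(AddTen(), f)
        os.environ["AZUREML_MODEL_DIR"] = tmp
        init()
        print(run(json.dumps({"data": [1, 2, 3]})))  # {"result": [12, 14, 16]}
    ```

    You then reference this script in the deployment's `code_configuration` (scoring script) and point the deployment at the model asset containing both files; the endpoint itself stays a single URL.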
