How to load 2 models for ML online inference

Giuseppe 10 Reputation points
2024-06-02T16:06:42.4033333+00:00

Hello,

I have 2 pre-trained models I would like to deploy for inference using only one Azure ML managed online endpoint. One model would generate an input for the other model. The examples available on the learning platform and on GitHub all reference only one model per endpoint.

Is it possible to load 2 models under the same managed online endpoint? Do you have a reference guide on how to do it?

Thank you

Azure Machine Learning
An Azure machine learning service for building and deploying models.

1 answer

  1. AshokPeddakotla-MSFT 34,611 Reputation points
    2024-06-03T03:22:33.4133333+00:00

    Giuseppe, greetings and welcome to the Microsoft Q&A forum!

    Did you already check these: Create a multimodel deployment and Create a multimodel deployment using a custom container?

    Also, see the Azure Machine Learning SDK (v2) examples for more details.
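    The chaining you describe can live entirely inside one deployment's scoring script: `init()` loads both models (everything you register with the deployment lands under `AZUREML_MODEL_DIR`), and `run()` feeds the first model's output into the second. Below is a minimal sketch of that pattern; the model filenames, the `joblib` loading shown in comments, and the stand-in lambda "models" are assumptions for illustration, not your actual models.

    ```python
    import json
    import os

    model_a = None
    model_b = None


    def init():
        """Called once when the deployment starts: load both models.

        In a real deployment you would register both model files (or one
        model folder containing both) and load them from AZUREML_MODEL_DIR,
        e.g. with joblib:
            model_a = joblib.load(os.path.join(model_dir, "model_a.pkl"))
            model_b = joblib.load(os.path.join(model_dir, "model_b.pkl"))
        (filenames are hypothetical). Simple callables stand in here so the
        chaining logic is visible end to end.
        """
        global model_a, model_b
        model_dir = os.environ.get("AZUREML_MODEL_DIR", ".")
        model_a = lambda xs: [x * 2 for x in xs]   # stand-in for model A
        model_b = lambda xs: [x + 1 for x in xs]   # stand-in for model B


    def run(raw_data):
        """Handle one scoring request: model A's output is model B's input."""
        data = json.loads(raw_data)["data"]
        intermediate = model_a(data)          # first model produces the input
        result = model_b(intermediate)        # second model consumes it
        return json.dumps({"result": result})
    ```

    Because both models sit behind one `run()`, the client makes a single request and never sees the intermediate result; if the two models need different environments or independent scaling, separate deployments (or endpoints) would be the alternative.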

    1 person found this answer helpful.
