Create a Managed ML Inference Endpoint and deployment using Terraform

Pietro Bolcato 10 Reputation points
2023-02-06T22:05:50.66+00:00

I would like to use Terraform to create a Managed ML Inference Endpoint and its corresponding deployment. I know how to do it with the CLI, the Python SDK, and an ARM template, but I can't find a way to do it with Terraform, which feels very strange.

In the Terraform docs, I only find how to create an ML workspace and compute instances. What am I missing? Is it really not possible to do this with Terraform?
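
For context, this is roughly as far as I get with the azurerm provider today. It is only a minimal sketch: every name, the region, and the dependent resources (resource group, Application Insights, Key Vault, Storage Account) are placeholders I made up for illustration.

```hcl
terraform {
  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "~> 3.0"
    }
  }
}

provider "azurerm" {
  features {}
}

data "azurerm_client_config" "current" {}

# Placeholder resource group and workspace dependencies.
resource "azurerm_resource_group" "ml" {
  name     = "rg-ml-example"
  location = "westeurope"
}

resource "azurerm_application_insights" "ml" {
  name                = "appi-ml-example"
  location            = azurerm_resource_group.ml.location
  resource_group_name = azurerm_resource_group.ml.name
  application_type    = "web"
}

resource "azurerm_key_vault" "ml" {
  name                = "kv-ml-example"
  location            = azurerm_resource_group.ml.location
  resource_group_name = azurerm_resource_group.ml.name
  tenant_id           = data.azurerm_client_config.current.tenant_id
  sku_name            = "standard"
}

resource "azurerm_storage_account" "ml" {
  name                     = "stmlexample"
  location                 = azurerm_resource_group.ml.location
  resource_group_name      = azurerm_resource_group.ml.name
  account_tier             = "Standard"
  account_replication_type = "LRS"
}

# The workspace itself is covered by the provider...
resource "azurerm_machine_learning_workspace" "ml" {
  name                    = "mlw-example"
  location                = azurerm_resource_group.ml.location
  resource_group_name     = azurerm_resource_group.ml.name
  application_insights_id = azurerm_application_insights.ml.id
  key_vault_id            = azurerm_key_vault.ml.id
  storage_account_id      = azurerm_storage_account.ml.id

  identity {
    type = "SystemAssigned"
  }
}

# ...but I can't find an equivalent resource for a managed online
# endpoint or its deployment, which is what I'm trying to create.
```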

Thank you so much in advance! I really appreciate your help!


1 answer

  1. Pietro Bolcato 10 Reputation points
    2023-02-07T13:26:19.1+00:00

    I see, thanks a lot @romungi-MSFT! This is indeed strange, but it is what it is.

    Do you think I can use azurerm_resource_group_template_deployment to create an ML managed online endpoint from an ARM template? It's not ideal because it mixes ARM and Terraform, but at least I could manage all the resources from Terraform.
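
    Something along these lines is what I have in mind. This is only a rough sketch, assuming the workspace and resource group from the snippet in my question; the endpoint name, the apiVersion, and the ARM property names are my own guesses and would need to be checked against the ARM reference for Microsoft.MachineLearningServices/workspaces/onlineEndpoints.

    ```hcl
    resource "azurerm_resource_group_template_deployment" "online_endpoint" {
      name                = "deploy-online-endpoint" # hypothetical deployment name
      resource_group_name = azurerm_resource_group.ml.name
      deployment_mode     = "Incremental"

      # Inline ARM template for the managed online endpoint. The apiVersion and
      # property names below are assumptions, not something confirmed here.
      template_content = jsonencode({
        "$schema"      = "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#"
        contentVersion = "1.0.0.0"
        resources = [
          {
            type       = "Microsoft.MachineLearningServices/workspaces/onlineEndpoints"
            apiVersion = "2022-10-01"
            name       = "${azurerm_machine_learning_workspace.ml.name}/my-endpoint" # hypothetical endpoint name
            location   = azurerm_resource_group.ml.location
            identity   = { type = "SystemAssigned" }
            properties = { authMode = "Key" }
          }
        ]
      })
    }
    ```

    I imagine the deployment under the endpoint could be added the same way, as a Microsoft.MachineLearningServices/workspaces/onlineEndpoints/deployments child resource in the same template, but I haven't tried that yet.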

