Automated machine learning model deployment issue

Hidde 101 Reputation points
2021-07-27T07:40:04.967+00:00

I'm having an issue setting up an endpoint for a machine learning model that was trained with Azure AutoML. When I try to test the deployed model, I get an error saying the service is temporarily unavailable. From looking online, I found that this can happen because of an error in the run() function of the entry script.
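
(For reference, the entry script follows the standard init()/run() shape for an AutoML model; the snippet below is a simplified sketch with placeholder names, not my exact script.)

    import json
    import joblib
    import pandas as pd
    from azureml.core.model import Model

    def init():
        global model
        # "automl_model" is a placeholder for the registered model name
        model_path = Model.get_model_path("automl_model")
        model = joblib.load(model_path)

    def run(raw_data):
        try:
            # assumes the request body looks like {"data": [...]}
            data = pd.DataFrame(json.loads(raw_data)["data"])
            return {"result": model.predict(data).tolist()}
        except Exception as e:
            return {"error": str(e)}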

When I test the entry script in a notebook in Azure ML studio, on a fresh compute instance, there are two problems.

First I get the error AttributeError: 'MSIAuthentication' object has no attribute 'get_token', which is solved by running pip install azureml-core.

Then I get the error ModuleNotFoundError: No module named 'azureml.automl.runtime', which I try to solve with pip install azureml-automl-runtime. However, this throws a lot of incompatibility errors during installation, and when I then run the entry script it fails with "Failed while applying learned transformations."
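
For what it's worth, my understanding is that these conflicts usually come from azureml-* packages ending up on different releases (the compute instance comes with azureml-core preinstalled), so pinning everything to one version should in principle install cleanly, e.g. something like this (the version number is just an illustration):

    pip install azureml-core==1.32.0 azureml-automl-runtime==1.32.0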

So I set up a new virtual environment on my local machine in which I only installed azureml-automl-runtime. With that setup the entry script works perfectly fine, so I created a custom environment in Azure ML studio from the conda file of that local virtual environment. Unfortunately, I still get the "service temporarily unavailable" error when trying to test the endpoint.
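
To illustrate what I mean by a custom environment built from the conda file, this is roughly the SDK call I'm talking about (the environment name and file path are placeholders):

    from azureml.core import Workspace, Environment

    ws = Workspace.from_config()

    # placeholders for the exported conda file of the local virtual environment
    env = Environment.from_conda_specification(
        name="automl-local-env",
        file_path="environment.yml",
    )
    env.register(workspace=ws)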

I have a feeling the default Azure ML containers are incompatible with azureml-automl-runtime, since installing it in an ML studio notebook also throws a lot of errors.

I feel like there should be an elegant way to deploy an AutoML model; am I doing something wrong here?

Update: I found out I hadn't changed the environment for the endpoint, which is probably why I was still getting the same error. When using the custom environment I got errors from gunicorn, so I added that package to the environment as well. Now I get the following error:

      File "/var/azureml-server/entry.py", line 1, in <module>
    import create_app
  File "/var/azureml-server/create_app.py", line 4, in <module>
    from routes_common import main
  File "/var/azureml-server/routes_common.py", line 39, in <module>
    from azure.ml.api.exceptions.ClientSideException import ClientSideException
ModuleNotFoundError: No module named 'azure.ml'

So what do I install to fix this? Is there a list somewhere of required packages for an ML model endpoint?
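
From what I can tell, the inference server also expects azureml-defaults (which pulls in gunicorn and the rest of the scoring server dependencies) on top of the AutoML runtime, so I'm guessing the environment needs something along these lines (names and versions are only illustrative):

    from azureml.core import Environment
    from azureml.core.conda_dependencies import CondaDependencies

    # all azureml-* packages should be kept on the same release
    deps = CondaDependencies.create(
        python_version="3.7",
        pip_packages=[
            "azureml-defaults",        # scoring server dependencies (gunicorn, flask, ...)
            "azureml-automl-runtime",  # needed to apply AutoML's learned transformations
        ],
    )
    env = Environment(name="automl-inference-env")
    env.python.conda_dependencies = deps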


Accepted answer
  1. Hidde 101 Reputation points
    2021-07-28T09:19:56.087+00:00

    I managed to fix the environment issue by just adding every package that threw an error. Then I found out that the return value of run() has to be a JSON-serializable dict; if it isn't, you get the exact same 'service temporarily unavailable' error.
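
    A minimal illustration of what I mean (simplified; the input handling is just an example, and model is whatever init() loaded):

        import json
        import pandas as pd

        def run(raw_data):
            data = pd.DataFrame(json.loads(raw_data)["data"])
            predictions = model.predict(data)  # model loaded in init()
            # returning the raw numpy array fails JSON serialization and the endpoint
            # only reports "service temporarily unavailable"; return a plain dict instead
            return {"result": predictions.tolist()}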

    But my issue with the confusing curated environments and azureml-automl-runtime in ML studio notebooks remains. Maybe this is worth looking into, @Ramr-msft.

    1 person found this answer helpful.

1 additional answer

  1. Ramr-msft 17,736 Reputation points
    2021-07-27T13:52:34.27+00:00

    @Hidde Thanks. Can you try this notebook for deployment? If it works for you (it should), compare it with your code:
    https://github.com/CESARDELATORRE/Easy-AutoML-MLOps/blob/master/notebooks/5-automl-model-service-deployment-and-inference/automl-model-service-deployment-and-inference-safe-driver-classifier.ipynb

    You’ll first need to train and register the model with this previous notebook using a pipeline:
    https://github.com/CESARDELATORRE/Easy-AutoML-MLOps/blob/master/notebooks/4-automlstep-pipeline-run/automlstep-pipeline-run-safe-driver-classifier.ipynb

    You can also use the notebook with a simple AutoML remote run, but you might need to change the name of the model when registering it in the Workspace, since it's a different name from the one the deployment notebook uses:
    https://github.com/CESARDELATORRE/Easy-AutoML-MLOps/blob/master/notebooks/3-automl-remote-compute-run/automl-remote-compute-run-safe-driver-classifier.ipynb
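
    For reference, the deployment step in those notebooks boils down to something along these lines (the model, environment and script names below are placeholders; adjust them to what you registered):

        from azureml.core import Workspace, Environment
        from azureml.core.model import Model, InferenceConfig
        from azureml.core.webservice import AciWebservice

        ws = Workspace.from_config()

        model = Model(ws, name="automl_model")
        env = Environment.get(ws, name="automl-inference-env")

        inference_config = InferenceConfig(entry_script="score.py", environment=env)
        deployment_config = AciWebservice.deploy_configuration(cpu_cores=1, memory_gb=2)

        service = Model.deploy(ws, "automl-endpoint", [model], inference_config, deployment_config)
        service.wait_for_deployment(show_output=True)
        print(service.get_logs())  # the container logs usually show the real cause of a 'service unavailable' response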

