
KuikenPhilipvan-5195 asked:

Manually deploy AzureML Docker image to App Service

I am trying to deploy an Azure ML web service endpoint to an App Service. I prefer App Service over ACI or AKS endpoints because App Service offers scaling and SSL certificates without having to set up and maintain a fully secured AKS cluster, and ACI is not recommended for production environments.

The Docker image I am trying to deploy is the same image produced by an ACI or AKS deployment. I ran the 'az ml model deploy' command for an ACI endpoint, which builds and packages all resources needed for the endpoint into a Docker image stored in ACR. I then set up an App Service that pulls this image from ACR. Copying the startup command 'runsvdir /var/runit' and the AzureML environment variables from a working ACI example should yield a working web service endpoint on App Service. Unfortunately, I am stuck on an error saying the azureml-app directory, which contains the model, the model code, and the execution scripts, cannot be found:

 2021-03-05T08:38:53.803756130Z 2021-03-05T08:38:53,786438768+00:00 - gunicorn/run 
 2021-03-05T08:38:53.806168222Z ./run: line 13: cd: /var/azureml-app: No such file or directory

Pulling the Docker image from ACR and inspecting it manually confirmed that there is indeed no /var/azureml-app directory. However, connecting to the ACI instance and inspecting the running container shows that /var/azureml-app does exist there. It is not clear to me when this folder and its data are pulled into the image/container; I would expect this to happen during the 'az ml model deploy' command that builds the Docker image, but clearly that is not the case. This is the only thing preventing me from having a working App Service. Does anyone have an idea how to solve this?
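Since App Service pulls the image as-is and, unlike a local docker run, cannot easily bind-mount the missing folder into the container, one possible workaround is a derived image that bakes the azureml-app payload into the base image. This is only a sketch under assumptions: the registry and image names are placeholders, and the azureml-app contents here are empty stand-ins for the real score.py and model files.

```shell
# Sketch (names hypothetical): build a derived image whose /var/azureml-app
# already contains the model payload, so no mount is needed at run time.
BUILD_DIR=/tmp/appservice-build
mkdir -p "$BUILD_DIR/azureml-app"
touch "$BUILD_DIR/azureml-app/score.py"   # placeholder entry script

cat > "$BUILD_DIR/Dockerfile" <<'EOF'
# Base: the endpoint image that 'az ml model deploy' pushed to ACR (placeholder tag)
FROM myregistry.azurecr.io/mymodel:1
# Put the payload where the gunicorn run script expects it
COPY azureml-app /var/azureml-app
EOF

# The build/push step would then be (shown here, not executed):
echo "docker build -t myregistry.azurecr.io/mymodel-appservice:1 $BUILD_DIR"
```

The App Service would then point at the derived tag instead of the original endpoint image.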

More info about the Docker image built by AzureML: Docker Image AzureML


Tags: azure-webapps, azure-machine-learning, azure-container-instances

ramr-msft answered:

@KuikenPhilipvan-5195 Thanks for the question. You can perform local inference, but if you use Model.deploy to deploy the model, the SDK won't attach the azureml-app folder to the image. Therefore, if you pull the image from ACR and docker run it locally, you have to place at least your score.py and model.pkl under an azureml-app folder and use a command like the one below to start the container:

docker run -dit -p 5001:5001 -e AZUREML_MODEL_DIR=azureml-models/<model_name>/<version> -e AZUREML_ENTRY_SCRIPT=score.py --mount src="<local_filepath>",target=/var/azureml-app,type=bind <image_id> runsvdir /var/runit
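For illustration, <local_filepath> would be a host folder laid out like the container's /var/azureml-app. A minimal sketch follows; all names are hypothetical, and the docker command is echoed rather than executed since the image id must be filled in:

```shell
# Hypothetical host-side layout mirroring /var/azureml-app in the container
APP_DIR=/tmp/azureml-app
mkdir -p "$APP_DIR/azureml-models/mymodel/1"
touch "$APP_DIR/score.py"                            # entry script placeholder
touch "$APP_DIR/azureml-models/mymodel/1/model.pkl"  # model placeholder

# The corresponding run command (shown, not executed; <image_id> as above):
echo 'docker run -dit -p 5001:5001' \
     '-e AZUREML_MODEL_DIR=azureml-models/mymodel/1' \
     '-e AZUREML_ENTRY_SCRIPT=score.py' \
     "--mount src=$APP_DIR,target=/var/azureml-app,type=bind" \
     '<image_id> runsvdir /var/runit'
```

With this layout, AZUREML_MODEL_DIR is a path relative to the mounted folder, which is why it matches the azureml-models/<model_name>/<version> structure inside it.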


WillBlenkhorn-7568 answered:

Hi,

I am also interested in doing local inference. I have made an inference Docker image using the SDK, following this guide: https://docs.microsoft.com/en-us/azure/machine-learning/how-to-deploy-local. However, I want to run inference using just the Docker image. Regarding your command above:

docker run -dit -p 5001:5001 -e AZUREML_MODEL_DIR=azureml-models/<model_name>/<version> -e AZUREML_ENTRY_SCRIPT=score.py --mount src="<local_filepath>",target=/var/azureml-app,type=bind <image_id> runsvdir /var/runit

I don't fully understand the point of the --mount option, or what should be provided for <local_filepath>. It is also very hard to find documentation on this. Is there anywhere in the Azure docs that describes the above command for running the Docker image for inference?

Any help is much appreciated ;)
