Azure ML Compute Instance Times Out During File Upload

Jason Lunder 1 Reputation point

I have deployed a GPU-enabled Azure ML compute instance and am running a custom Docker container on top of it. The container uses BentoML to receive, batch, and manage inference requests, with various endpoints for doing so. I have exposed the necessary ports so that the endpoints are available to receive requests.

There are two major types of request endpoints involved. First, an endpoint that receives a path to a file the compute/container can access through a volume/storage mount, and performs inference on the file that path points to. This works just fine. Second, an endpoint to which the actual file is uploaded. The files in question are quite large image files, roughly 300-500 MB, and this second method has an issue: the file never reaches the endpoint. I attached a remote debugger and found that the data entrypoint is never reached in the upload case, while it is reached in the path case. I then replicated the container on a local machine and repeated the same scenario; there the container received and handled the image upload with no issue. Additionally, when the image is larger than 500 MB, uploading to the container on the ML compute instance returns an "entity too large" error that I do not get when doing the same thing on a local machine.

I have been unable to find any definitive documentation of the limits on file upload size/speed for ML compute instances. Everything I have found pertains to Azure App Service and to ML online endpoints (which could be an alternative option), but not to ML compute instances. Is there such a limit? If so, what is it?

Azure Machine Learning

1 answer

  1. YutongTie-MSFT 30,771 Reputation points

    Hello @Jason Lunder

    Thanks for using Microsoft Q&A, and sorry to hear your experience has not been smooth.

    Uploading a file directly to an Azure ML compute instance is actually not a recommended approach, due to a security limitation enforced by Nginx (Nginx is part of our environment here). Restricting upload size helps prevent certain denial-of-service (DoS) attacks and other related issues.

    The default upload limit in Nginx is only 1 MB; for Azure, the limit is 512 MB.
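    For illustration only (you cannot change this on the managed instance's own proxy, and the values below are hypothetical), the Nginx limit in question is controlled by the standard `client_max_body_size` directive. A self-managed reverse proxy in front of a BentoML service could raise it like this:

    ```nginx
    http {
        server {
            listen 80;
            location / {
                # Default is 1m; raise to allow ~600 MB uploads (hypothetical value)
                client_max_body_size 600m;
                # Forward to the BentoML service (port 3000 assumed)
                proxy_pass http://127.0.0.1:3000;
            }
        }
    }
    ```

    When a request body exceeds this limit, Nginx responds with "413 Request Entity Too Large", which matches the "entity too large" error you observed.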

    We take security challenges very seriously at Microsoft. The workaround I use myself, for your reference, is not to upload the file at all, but to build a custom Docker image whose build (or startup) commands download the needed file with wget.
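    As a minimal sketch of that workaround (the base image, target path, and URL are all placeholders, not real values), the image could fetch the file at build time:

    ```dockerfile
    # Base image: replace with the image your BentoML service actually uses
    FROM python:3.10-slim

    # Install wget for the download step
    RUN apt-get update && apt-get install -y --no-install-recommends wget \
        && rm -rf /var/lib/apt/lists/*

    # Download the large input file at build time instead of uploading it
    # to the running endpoint. The URL below is a placeholder; a SAS-token
    # blob URL would be one option.
    RUN mkdir -p /data \
        && wget -O /data/input-image.tif "https://<your-storage-account>.blob.core.windows.net/<container>/<file>?<sas-token>"
    ```

    At inference time you can then call your existing path-based endpoint with the baked-in path (here `/data/input-image.tif`). If the file changes per request, the same wget step could instead run in the container's entrypoint script at startup.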

    I hope my answer helps you solve your issue. I have also passed this feedback to the product team so this point can be made clearer in the documentation.


    -Please kindly accept the answer if you found it helpful, to support the community. Thanks a lot.