Out-of-memory errors from a webservice deployed with Azure ML Studio
I have a webservice that exposes a predictive model. It was deployed with Azure ML Studio. Since the last model retraining and webservice deployment, in roughly 1% of production cases I get the following, possibly correlated, out-of-memory errors:
1) "The model consumed more memory than was appropriated for it. Maximum allowed memory for the model is 2560 MB. Please check your model for issues."
2) "The following error occurred during evaluation of R script: R_tryEval: return error: Error: cannot allocate vector of size 57.6 Mb"
Please note that these errors occur exclusively when consuming the webservice, not during model training, evaluation, or deployment.
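For context, the service is consumed through the standard Request-Response endpoint. A minimal sketch of the call where the errors surface (the URL, API key, and column names below are placeholders, not our real schema):

```python
import json
import urllib.error
import urllib.request

# Placeholders: the real endpoint URL, API key, and input schema
# come from the web service dashboard in ML Studio.
URL = "https://<region>.services.azureml.net/workspaces/<workspace-id>/services/<service-id>/execute?api-version=2.0&details=true"
API_KEY = "<api-key>"

# Request-Response payload matching the input port schema
# of the published experiment.
payload = {
    "Inputs": {
        "input1": {
            "ColumnNames": ["feature1", "feature2"],
            "Values": [["value1", "value2"]],
        }
    },
    "GlobalParameters": {},
}

req = urllib.request.Request(
    URL,
    json.dumps(payload).encode("utf-8"),
    {"Content-Type": "application/json", "Authorization": "Bearer " + API_KEY},
)
try:
    with urllib.request.urlopen(req) as response:
        print(json.loads(response.read()))
except urllib.error.HTTPError as error:
    # The out-of-memory errors quoted above arrive here as HTTP
    # failures, with the message in the response body.
    print("Request failed:", error.code)
    print(error.read().decode("utf-8"))
```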
Also, consuming the webservice in batch mode, as suggested here, is not a viable option for our business use case.
Is there a way to increase the memory limit for Azure webservices?
Thank you
Thanks for reaching out. Currently, there is no way to increase the memory limit in Classic Studio. We encourage customers to try the Azure Machine Learning designer (preview), which provides similar drag-and-drop ML modules plus scalability, version control, and enterprise security. Furthermore, with the designer, endpoints are deployed to AKS, where no limit other than the cluster's own resources is imposed.
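By way of illustration, in the newer Azure Machine Learning workflow the memory available to an AKS deployment is an explicit setting rather than a fixed cap. A minimal sketch with the azureml-core SDK, assuming a registered model, a scoring script, a conda environment file, and an attached AKS cluster (all names below are placeholders):

```python
from azureml.core import Environment, Model, Workspace
from azureml.core.compute import AksCompute
from azureml.core.model import InferenceConfig
from azureml.core.webservice import AksWebservice

ws = Workspace.from_config()

# Registered model and attached AKS cluster; names are placeholders.
model = Model(ws, name="my-predictive-model")
aks_target = AksCompute(ws, "my-aks-cluster")

# Scoring environment and entry script for the deployed endpoint.
env = Environment.from_conda_specification("scoring-env", "environment.yml")
inference_config = InferenceConfig(entry_script="score.py", environment=env)

# Unlike Classic Studio's fixed limit, memory per replica is set here
# and bounded only by what the cluster nodes can provide.
deployment_config = AksWebservice.deploy_configuration(
    cpu_cores=2,
    memory_gb=8,  # raise this if the model needs more headroom
)

service = Model.deploy(
    ws,
    name="my-scoring-service",
    models=[model],
    inference_config=inference_config,
    deployment_config=deployment_config,
    deployment_target=aks_target,
)
service.wait_for_deployment(show_output=True)
```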