I have a webservice which exposes a predictive model. It has been deployed with Azure ML Studio. Since the last model re-training and webservice deployment, in roughly 1% of production calls I get the following (possibly related) out-of-memory errors:
1) "The model consumed more memory than was appropriated for it. Maximum allowed memory for the model is 2560 MB. Please check your model for issues."
2) "The following error occurred during evaluation of R script: R_tryEval: return error: Error: cannot allocate vector of size 57.6 Mb"
Please note that these errors occur only when consuming the webservice, not during model training, evaluation, or deployment.
Also, consuming the webservice in batch mode, as suggested here, is not a viable option for our business use case.
Is there a way to increase the memory limit for Azure webservices?
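For reference, this is roughly how we consume the request-response endpoint (a minimal sketch; the workspace/service identifiers and API key below are placeholders, not our real values):

```python
import json

# Placeholder endpoint and key for the Azure ML Studio (classic)
# request-response API -- substitute your own values.
SCORING_URL = ("https://<region>.services.azureml.net/workspaces/"
               "<workspace-id>/services/<service-id>/execute?api-version=2.0")
API_KEY = "<api-key>"

def build_payload(column_names, rows):
    """Build the JSON body in the shape the Studio
    request-response service expects."""
    return {
        "Inputs": {
            "input1": {
                "ColumnNames": column_names,
                "Values": rows,  # one inner list per scored row
            }
        },
        "GlobalParameters": {},
    }

payload = build_payload(["feature1", "feature2"], [["1.0", "2.0"]])
body = json.dumps(payload).encode("utf-8")
# An actual call POSTs `body` to SCORING_URL with headers:
#   Content-Type: application/json
#   Authorization: Bearer <API_KEY>
```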