MemoryError: Unable to allocate 3.35 GiB for an array with shape (3000000, 300) and data type float32
I'm trying to deploy a model using Azure Machine Learning. In the init() method of score.py (the entry script), I load the pre-trained Google News Word2Vec model (https://mccormickml.com/2016/04/12/googles-pretrained-word2vec-model-in-python/). When I try to create an endpoint, the following exception is thrown:
MemoryError: Unable to allocate 3.35 GiB for an array with shape (3000000, 300) and data type float32
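For context, the relevant part of the entry script looks roughly like this (a minimal sketch, assuming gensim's KeyedVectors loader and the GoogleNews binary from the linked tutorial; the file path is illustrative):

```python
def init():
    """Azure ML entry-script init: runs once when the scoring container starts."""
    global model
    # Deferred import so the sketch stands alone; gensim is in the deployment environment.
    from gensim.models import KeyedVectors

    # Loading the full 3,000,000 x 300 float32 matrix needs ~3.35 GiB of
    # contiguous free memory -- this is the line where the MemoryError is raised.
    model = KeyedVectors.load_word2vec_format(
        "GoogleNews-vectors-negative300.bin", binary=True
    )
```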
I'm using a Standard_DS12_v2 compute instance, whose RAM and storage I would expect to be more than sufficient for a 3.35 GiB allocation.
Any suggested solutions? Thanks a lot.