How to allocate the required memory?

Siddhesh Bhurke 46 Reputation points
2021-01-11T11:23:18.757+00:00

Error: cannot allocate vector of size 20.5 Gb

I am calling multiple notebooks within one notebook that trains my model and saves the results.

```python
dbutils.notebook.run("../path/to_model_file", timeout_seconds=48000, arguments={"yearperiod": arg_period})
```

I am working on Azure Databricks, and when I run a function it throws this error. My code is written in R.

My cluster configuration is:

- Runtime: 6.4 (includes Apache Spark 2.4.5, Scala 2.11)
- Worker type: Standard_DS4_v2, 28 GB, 8 cores, 1.5 DBU, 2-8 workers
- Driver type: Standard_DS3_v2, 14 GB, 4 cores, 0.75 DBU
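For context (a back-of-the-envelope estimate, not from the thread): base R allocates vectors in a single process on the driver node, and an R numeric vector stores 8 bytes per element. Assuming "Gb" in R's message means gibibytes (2^30 bytes) and the allocation is a numeric vector, a quick check shows why 20.5 Gb cannot fit on the 14 GB Standard_DS3_v2 driver above:

```python
# Rough sanity check (assumptions: R's "Gb" = 2**30 bytes; the vector
# being allocated is numeric, i.e. 8 bytes per element).
requested_gib = 20.5
requested_bytes = requested_gib * 2**30
elements = requested_bytes / 8          # number of doubles requested

driver_ram_gib = 14                     # Standard_DS3_v2 driver from the question

print(f"elements requested: {elements:.2e}")                    # ~2.75e+09 doubles
print(f"fits in driver RAM: {requested_gib < driver_ram_gib}")  # False
```

Under the same assumption, typical mitigations are a larger driver node type, or keeping the data distributed (e.g. via SparkR) rather than collecting it into one in-process R vector.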


1 answer

  1. Saurabh Sharma 23,851 Reputation points Microsoft Employee Moderator
    2021-01-15T23:49:22.553+00:00

    @Siddhesh Bhurke I have talked to the product team, and since this would require deeper inspection of your code and environment, it has been recommended that you create a support ticket.

    If you have any restrictions on creating a support ticket, please send an email to azcommunity@microsoft.com with your subscription ID, a link to this Q&A thread, and the subject "Attn: Saurabh", and I will help provide a free support ticket for you.

    Also, provide the support ticket number here so that I can work internally to expedite the resolution.

    Thanks
    Saurabh

