Maximum execution context or notebook attachment limit reached

Learn what to do when the maximum execution context or notebook attachment limit is reached in Databricks.

Written by rakesh.parija

Last published at: May 15th, 2023

Problem

Notebook or job execution stops and returns one of the following errors:

  • Run result unavailable: job failed with error message Context ExecutionContextId(1731742567765160237) is disconnected.
  • Can’t attach this notebook because the cluster has reached the attached notebook limit. Detach a notebook and retry.

Cause

When you attach a notebook to a cluster, Databricks creates an execution context (AWS | Azure). If too many notebooks are attached to a cluster, or too many jobs run on it, the cluster eventually reaches its limit of 145 execution contexts and Databricks returns an error.
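Since each attached notebook holds one execution context, detaching a notebook (which destroys its context) frees a slot. As a minimal sketch, the context can also be destroyed explicitly through the Command Execution API 1.2 endpoint /api/1.2/contexts/destroy; the workspace URL and cluster ID below are placeholders, and the context ID is the one shown in the error message:

```python
import json

CONTEXT_LIMIT = 145  # per-cluster execution context limit

def destroy_context_request(host: str, cluster_id: str, context_id: str) -> dict:
    """Build the POST request spec for /api/1.2/contexts/destroy,
    which frees one of the cluster's execution context slots."""
    return {
        "method": "POST",
        "url": f"{host}/api/1.2/contexts/destroy",
        "json": {"clusterId": cluster_id, "contextId": context_id},
    }

req = destroy_context_request(
    "https://example.cloud.databricks.com",  # placeholder workspace URL
    "0123-456789-abcdef",                    # placeholder cluster ID
    "1731742567765160237",                   # context ID from the error message
)
print(json.dumps(req, indent=2))
```

The request spec would then be sent with an authenticated HTTP client (for example, the requests library with a bearer token header).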

Solution

Configure context auto-eviction (AWS | Azure), which allows Databricks to remove (evict) idle execution contexts. From a pipeline and ETL design perspective, you can also avoid this issue by using:

  • Fewer notebooks to reduce the number of execution contexts that are created.
  • A job cluster instead of an interactive cluster. If the use case permits, submit notebooks or JARs as jobs.
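To illustrate the second bullet, one way to run a notebook on an ephemeral job cluster rather than an interactive cluster is the Jobs API runs/submit endpoint. The payload sketch below assumes placeholder values for the notebook path, Databricks Runtime version, and node type:

```python
def one_time_notebook_run(notebook_path: str) -> dict:
    """Build a Jobs API 2.1 runs/submit payload that runs a notebook
    on a fresh job cluster, so no interactive-cluster execution
    context is consumed."""
    return {
        "run_name": "one-time notebook run",
        "tasks": [
            {
                "task_key": "main",
                "notebook_task": {"notebook_path": notebook_path},
                "new_cluster": {  # ephemeral job cluster; sizing is an assumption
                    "spark_version": "13.3.x-scala2.12",
                    "node_type_id": "i3.xlarge",
                    "num_workers": 2,
                },
            }
        ],
    }

payload = one_time_notebook_run("/Users/someone@example.com/etl_notebook")
```

Posting this payload to /api/2.1/jobs/runs/submit creates the cluster, runs the notebook, and tears the cluster down afterward, so the interactive cluster's execution context count is unaffected.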