

Cluster configuration for Databricks Connect

Note

This article covers Databricks Connect for Databricks Runtime 13.3 LTS and above.

This article lists the configuration settings that Azure Databricks compute must meet for Databricks Connect to connect to it. This information applies to the Python and Scala versions of Databricks Connect unless stated otherwise.

Databricks Connect enables you to connect popular IDEs such as Visual Studio Code, PyCharm, RStudio Desktop, IntelliJ IDEA, notebook servers, and other custom applications to Azure Databricks clusters. See What is Databricks Connect?.

Requirements

  • An Azure Databricks account and workspace that have Unity Catalog enabled. See Set up and manage Unity Catalog and Enable a workspace for Unity Catalog.
  • An Azure Databricks cluster with Databricks Runtime 13.3 LTS or above installed.
  • The Databricks Runtime version of your cluster must be equal to or higher than the Databricks Connect package version. Databricks recommends that you use the most recent Databricks Connect package that matches the Databricks Runtime version. If you want to use features that are available in later versions of the Databricks Runtime, you must upgrade the Databricks Connect package. See the Databricks Connect release notes for a list of available Databricks Connect releases. For Databricks Runtime version release notes, see Databricks Runtime release notes versions and compatibility.
  • The cluster must use a cluster access mode of Assigned or Shared. See Access modes.

Programmatic validation

In Databricks Connect 14.3 and above, DatabricksSession.builder introduces validateSession, which runs a series of validations to ensure that the preceding requirements are met.
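
For example, in Databricks Connect for Python, a minimal sketch of this validation looks like the following. It assumes that default connection properties are already configured, for example in a .databrickscfg configuration profile.

from databricks.connect import DatabricksSession

# Run the validations while building the session. An error is raised
# if the cluster does not meet the preceding requirements.
spark = DatabricksSession.builder.validateSession(True).getOrCreate()

# If validation succeeds, the session can be used as a regular Spark session.
spark.range(5).show()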

In Databricks Connect for Python, the databricks-connect binary has a test subcommand that performs the same set of validations.

Run this command in a terminal with an activated Python environment that includes Databricks Connect and with default credentials configured. To configure these credentials, see Configure connection properties.

databricks-connect test

The command will fail with a non-zero exit code and an appropriate message when any of the requirements are not met.
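
The default credentials mentioned above typically come from a configuration profile or environment variables; see Configure connection properties. As a minimal sketch in Python (with placeholder values for the workspace host, access token, and cluster ID), connection properties can also be passed directly to the session builder instead of relying on defaults:

from databricks.connect import DatabricksSession

# Placeholder values: replace with your workspace URL, a personal access token,
# and the ID of a cluster that meets the requirements above.
spark = DatabricksSession.builder.remote(
    host="https://<workspace-instance-name>",
    token="<personal-access-token>",
    cluster_id="<cluster-id>",
).getOrCreate()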

Disabling Databricks Connect

The Databricks Connect service (and the underlying Spark Connect service) can be disabled on any given cluster.

To disable the Databricks Connect service, set the following Spark configuration on the cluster.

spark.databricks.service.server.enabled false

Next steps