March 2024

These features and Azure Databricks platform improvements were released in March 2024.

Note

Releases are staged. Your Azure Databricks account might not be updated until a week or more after the initial release date.

DBRX Base and DBRX Instruct are now available in Model Serving

March 27, 2024

Databricks Model Serving now supports DBRX Base and DBRX Instruct, state-of-the-art mixture-of-experts (MoE) language models trained by Databricks. Both models are part of Foundation Model APIs: DBRX Instruct is a fine-tuned model available in regions that support pay-per-token serving endpoints, and DBRX Base is a pretrained model available in a limited set of regions that support provisioned throughput serving endpoints. See Use Foundation Model APIs.
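
A minimal sketch of querying the pay-per-token endpoint from Python, assuming the openai package is installed, DATABRICKS_TOKEN holds a personal access token, and the workspace URL and the databricks-dbrx-instruct endpoint name are placeholders for your own values:

    import os
    from openai import OpenAI

    # Placeholder workspace URL and token; the endpoint name is assumed to be
    # the pay-per-token DBRX Instruct endpoint in your workspace.
    client = OpenAI(
        api_key=os.environ["DATABRICKS_TOKEN"],
        base_url="https://<your-workspace-url>/serving-endpoints",
    )

    response = client.chat.completions.create(
        model="databricks-dbrx-instruct",
        messages=[{"role": "user", "content": "Explain mixture of experts in two sentences."}],
        max_tokens=256,
    )
    print(response.choices[0].message.content)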

Model Serving is HIPAA compliant in all regions

March 27, 2024

Databricks Model Serving is HIPAA compliant in all regions where the service is generally available. See Region availability for Model Serving supported regions.

Provisioned throughput in Foundation Model APIs is GA and HIPAA compliant

March 27, 2024

Provisioned throughput model serving in Databricks Foundation Model APIs is now generally available. As part of general availability, provisioned throughput workloads are also HIPAA compliant. See Provisioned throughput Foundation Model APIs.

MLflow now enforces quota limits for experiments and runs

March 27, 2024

MLflow now enforces quota limits on the total number of parameters, tags, and metric steps for all existing and new runs, and on the total number of runs for all existing and new experiments. See Resource limits.
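
If a long-running experiment is approaching its run quota, one option is to prune old runs before logging new ones. A minimal sketch, assuming the experiment ID and the 90-day retention window are your own choices:

    import time
    import mlflow
    from mlflow.tracking import MlflowClient

    experiment_id = "1234567890"  # illustrative experiment ID
    cutoff_ms = int((time.time() - 90 * 24 * 3600) * 1000)  # runs older than ~90 days

    # Find runs started before the cutoff and soft-delete them.
    old_runs = mlflow.search_runs(
        experiment_ids=[experiment_id],
        filter_string=f"attributes.start_time < {cutoff_ms}",
        output_format="list",
    )
    client = MlflowClient()
    for run in old_runs:
        client.delete_run(run.info.run_id)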

The Jobs UI is updated to better manage jobs deployed by Databricks Asset Bundles

March 26, 2024

Because modifications to Azure Databricks jobs deployed by Databricks Asset Bundles should be made only by updating the bundle configuration, these jobs are now read-only by default when viewed in the Jobs UI. Previously, these jobs could be modified in the UI by default, which could create unintentional drift between the configuration shown in the UI and the bundle configuration. An option is still available for cases where you must make emergency changes to a job. See View and run a job created with a Databricks Asset Bundle.

Google Cloud Vertex AI supported as model provider for external models

March 25, 2024

External models in Databricks Model Serving now support models provided by Google Cloud Vertex AI. See Model providers for external models.
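
A minimal sketch of registering a Vertex AI chat model as an external model endpoint with the MLflow deployments client; the endpoint name, model name, provider string, and the google_cloud_vertex_ai_config field names are illustrative assumptions, so check the external models documentation for the authoritative schema:

    from mlflow.deployments import get_deploy_client

    client = get_deploy_client("databricks")

    # Configuration keys below are assumptions for illustration only.
    client.create_endpoint(
        name="vertex-chat",  # illustrative endpoint name
        config={
            "served_entities": [
                {
                    "external_model": {
                        "name": "gemini-pro",
                        "provider": "google-cloud-vertex-ai",
                        "task": "llm/v1/chat",
                        "google_cloud_vertex_ai_config": {
                            "private_key": "{{secrets/my_scope/vertex_private_key}}",
                            "project_id": "my-gcp-project",
                            "region": "us-central1",
                        },
                    }
                }
            ]
        },
    )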

Interactive notebook debugging

March 22, 2024

Databricks now supports interactive Python debugging directly in the notebook for clusters in Single user or No isolation shared access mode. With interactive debugging, you can step through code line by line and view variable values to discover and fix errors in code. For more information, see Use the Databricks interactive debugger.

Self-service sign-up for private exchange providers in Marketplace

March 22, 2024

If you want to publish only private exchange listings in Databricks Marketplace, you can now sign up using a self-service workflow. To publish public listings, you still need to apply through the Databricks partner portal. See Sign up to be a Databricks Marketplace provider.

Databricks Runtime 15.0 is GA

March 22, 2024

Databricks Runtime 15.0 and Databricks Runtime 15.0 ML are now generally available.

See Databricks Runtime 15.0 and Databricks Runtime 15.0 for Machine Learning.

Databricks Repos changed to Git folders

March 21, 2024

The former Databricks Repos feature is now called “Git folders”. If you have existing Repos, they are preserved under the same file system paths. See What happened to Databricks Repos?.

Databricks Runtime 14.1 and 14.2 series support extended

March 20, 2024

Support for Databricks Runtime 14.1 and Databricks Runtime 14.1 for Machine Learning has been extended from April 11, 2024 to October 1, 2024.

Support for Databricks Runtime 14.2 and Databricks Runtime 14.2 for Machine Learning has been extended from May 22, 2024 to October 1, 2024.

See All supported Databricks Runtime releases.

Databricks ODBC driver 2.8.0

March 19, 2024

We have released version 2.8.0 of the Databricks ODBC driver (download). This release adds the following new features and enhancements:

  • Support for JWT assertion as the client credential for OAuth.
  • Token renewal support. For token passthrough authentication, you can now renew your token.
  • Updated third-party libraries: Arrow 15.0.0 (previously 9.0.0, on Windows), libcURL 8.6.0 (previously 8.4.0), and Zlib 1.3.1 (previously 1.2.13).
  • Support for accepting certificates with an undetermined revocation status (Accept Undetermined Revocation).

This release also resolves the following issue:

  • When UseNativeQuery is set to 1 against a cluster running a Databricks Runtime version later than 11, the connector returned an incorrect column number after SQLPrepare.

For more information, see the release notes or the Installation and Configuration Guide in the installation package.
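
A minimal pyodbc sketch that exercises the UseNativeQuery=1 path mentioned above; the driver name, host, HTTP path, and token are placeholders, and personal access token authentication (AuthMech=3) is assumed rather than the new OAuth JWT assertion flow:

    import pyodbc

    # Placeholder connection values for an Azure Databricks cluster.
    conn = pyodbc.connect(
        "Driver=Simba Spark ODBC Driver;"
        "Host=adb-1234567890123456.7.azuredatabricks.net;"
        "Port=443;"
        "HTTPPath=sql/protocolv1/o/<workspace-id>/<cluster-id>;"
        "SSL=1;"
        "ThriftTransport=2;"
        "AuthMech=3;"  # personal access token authentication
        "UID=token;"
        "PWD=<personal-access-token>;"
        "UseNativeQuery=1",
        autocommit=True,
    )
    cursor = conn.cursor()
    cursor.execute("SELECT current_version()")
    print(cursor.fetchone())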

Manage private endpoint rules (Public Preview)

March 19, 2024

You can now view and manage private endpoint rules for private link from serverless compute using the Azure Databricks account console. This feature will roll out to all accounts over one or more weeks. See Manage private endpoint rules.

Workspace access for Azure Databricks personnel

March 19, 2024

By default, Azure Databricks personnel do not have access to customer workspaces or to the production multi-tenant environments. Workspace admins can now grant Azure Databricks personnel temporary access to their workspace to investigate an outage or a security event, or to support your deployment. For more information, see Workspace access for Azure Databricks personnel.

HIPAA compliance features now support serverless compute

March 15, 2024

The compliance security profile enhancements for HIPAA now apply to compute resources in the serverless compute plane. See HIPAA compliance features.

SQL warehouses for notebooks is GA

March 15, 2024

SQL warehouses for notebooks, now generally available, allow you to take advantage of fully managed, instant, and scalable compute for your SQL workloads within the rich, collaborative authoring environment of a notebook. For details, see Use a notebook with a SQL warehouse.

Delegate the ability to view an object’s metadata in Unity Catalog (Public Preview)

March 15, 2024

You can now grant users, service principals, and account groups permission to view a Unity Catalog object’s metadata using the new BROWSE privilege. This enables users to discover data without having read access to the data. A user can view an object’s metadata using Catalog Explorer, the schema browser, search results, the lineage graph, information_schema, and the REST API.

The BROWSE privilege can be granted on a catalog or on an external location. Granting BROWSE on a catalog automatically grants BROWSE to all current and future objects within the catalog. A user with the BROWSE privilege does not require USE CATALOG on the parent catalog or USE SCHEMA on the parent schema to view an object’s metadata.

See BROWSE.
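
A minimal sketch of granting and inspecting the privilege from a notebook; the catalog and group names are illustrative:

    # Let a group discover object metadata in the catalog without read access to the data.
    spark.sql("GRANT BROWSE ON CATALOG main TO `data-discovery-team`")

    # Confirm the grant alongside any other privileges on the catalog.
    spark.sql("SHOW GRANTS ON CATALOG main").show(truncate=False)

    # Remove the privilege if metadata discovery should no longer be allowed.
    spark.sql("REVOKE BROWSE ON CATALOG main FROM `data-discovery-team`")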

New per region limit for private endpoints

March 14, 2024

To give customers more flexibility in managing serverless compute plane networking, Databricks now supports up to 100 private endpoints per region. The private endpoints can be distributed as needed across network connectivity configurations (NCCs). Previously, Databricks supported up to 10 private endpoints per NCC and 10 NCCs per region. See Configure private connectivity from serverless compute.

Databricks Runtime 15.0 (Beta)

March 11, 2024

Databricks Runtime 15.0 and Databricks Runtime 15.0 ML are now available as Beta releases.

Databricks Runtime 14.0 series support ends

March 11, 2024

Support for Databricks Runtime 14.0 and Databricks Runtime 14.0 for Machine Learning ended on March 11. See Databricks runtime support lifecycles.

New computation for sys.path and CWD in Repos

March 8, 2024

We’ve updated how sys.path and the current working directory (CWD) are computed for Python notebooks and files in Repos. There are no functional changes. For information about sys.path, see the sys.path spec.
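
Although there are no functional changes, you can print both values from a notebook in a repo to confirm what is on the import path; a minimal sketch:

    import os
    import sys

    # Inspect the computed working directory and Python import path.
    print("CWD:", os.getcwd())
    for entry in sys.path:
        print("sys.path entry:", entry)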

Feature Serving is GA

March 7, 2024

With Databricks Feature Serving, data in the Databricks platform can be made available to models or applications deployed outside of Databricks. Like Databricks Model Serving endpoints, Feature Serving endpoints automatically scale to adjust to real-time traffic and provide a high-availability, low-latency service at any scale. For details, see What is Databricks Feature Serving?.

You can use Databricks Feature Serving to serve structured data for retrieval augmented generation (RAG) applications. For an example notebook, see online tables with RAG applications.
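
A minimal sketch of querying a Feature Serving endpoint with the MLflow deployments client; the endpoint name and the primary key column are illustrative:

    from mlflow.deployments import get_deploy_client

    client = get_deploy_client("databricks")

    # Look up the precomputed feature values for the supplied primary keys.
    response = client.predict(
        endpoint="user-features",  # illustrative Feature Serving endpoint name
        inputs={"dataframe_records": [{"user_id": 12345}]},
    )
    print(response)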

Predictive optimization available in more regions

March 5, 2024

Predictive optimization is now available in the following regions, in addition to the regions where it was already available:

  • australiaeast
  • brazilsouth
  • canadacentral
  • centralus
  • southeastasia

For a complete list of supported regions, see Supported regions list. For more information, see Predictive optimization for Delta Lake.
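
Once your metastore is in a supported region, predictive optimization can be enabled at the catalog or schema level; a minimal sketch from a notebook, with illustrative object names:

    # Requires the appropriate privileges on the catalog.
    spark.sql("ALTER CATALOG main ENABLE PREDICTIVE OPTIMIZATION")

    # Child schemas can also inherit the setting from their parent catalog.
    spark.sql("ALTER SCHEMA main.sales INHERIT PREDICTIVE OPTIMIZATION")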