How to set up databricks-bundle in Azure Databricks UI

Viet Tran 70 Reputation points
2025-07-07T14:35:28.33+00:00

Hello,

I'm a newbie with Databricks and databricks-bundle for CI/CD. I have three questions:

  1. By chance, I saw this picture:

[User's image]

I tried to look for documents or videos related to Databricks Bundles in the UI (like in the image above), including how to set it up and deploy it from the Databricks UI, but I couldn't find any. Could you point me to some documentation for this?

  2. I read about DAB and understand that it is used to deploy workflows and pipelines from development to higher environments in a CI/CD process. Can we use DAB to deploy just notebooks, without pipelines and workflows?
  3. Do we have any approach that migrates notebooks from a Repo to the Shared folder in Databricks automatically?

[User's image]

Thanks!

Azure Databricks
An Apache Spark-based analytics platform optimized for Azure.

Answer accepted by question author
  1. PRADEEPCHEEKATLA 91,496 Reputation points Moderator
    2025-07-09T20:49:08.3133333+00:00

    @Viet Tran - Thanks for the question and using MS Q&A platform.

Databricks Asset Bundles are created, deployed to a workspace, then managed programmatically from your local development environment or directly in the workspace UI. Collaborating on bundles directly in the workspace allows for more rapid iteration and testing before moving to production.

    Note: Databricks Asset Bundles in the workspace is in Public Preview.

    How can I access Databricks Asset Bundles in my workspace?

    FYI, you do not have to install anything locally to use bundles in the workspace, but there are Databricks workspace requirements:

    • Workspace files must be enabled.
    • You must have a Git folder in which to create the bundle. To create a Git folder, see Clone a repo connected to a remote Git repository.
    • Serverless compute must be enabled.

    For more details, refer to Collaborate on bundles in the workspace and my post on LinkedIn: 🚀 Next-Gen DevOps for Data Teams is HERE! 🔧📦


    ๐˜›๐˜ฐ ๐˜ด๐˜ต๐˜ข๐˜บ ๐˜ช๐˜ฏ๐˜ง๐˜ฐ๐˜ณ๐˜ฎ๐˜ฆ๐˜ฅ ๐˜ข๐˜ฃ๐˜ฐ๐˜ถ๐˜ต ๐˜ต๐˜ฉ๐˜ฆ ๐˜ญ๐˜ข๐˜ต๐˜ฆ๐˜ด๐˜ต ๐˜ถ๐˜ฑ๐˜ฅ๐˜ข๐˜ต๐˜ฆ๐˜ด ๐˜ข๐˜ฏ๐˜ฅ ๐˜ช๐˜ฏ๐˜ด๐˜ช๐˜จ๐˜ฉ๐˜ต๐˜ด ๐˜ฐ๐˜ฏ ๐˜ˆ๐˜ป๐˜ถ๐˜ณ๐˜ฆ ๐˜‹๐˜ข๐˜ต๐˜ข๐˜ฃ๐˜ณ๐˜ช๐˜ค๐˜ฌ๐˜ด, ๐˜ฅ๐˜ข๐˜ต๐˜ข ๐˜ฆ๐˜ฏ๐˜จ๐˜ช๐˜ฏ๐˜ฆ๐˜ฆ๐˜ณ๐˜ช๐˜ฏ๐˜จ, ๐˜ข๐˜ฏ๐˜ฅ ๐˜ค๐˜ญ๐˜ฐ๐˜ถ๐˜ฅ ๐˜ช๐˜ฏ๐˜ฏ๐˜ฐ๐˜ท๐˜ข๐˜ต๐˜ช๐˜ฐ๐˜ฏ๐˜ด, ๐˜ง๐˜ฐ๐˜ญ๐˜ญ๐˜ฐ๐˜ธ ๐˜ฎ๐˜ฆ ๐˜ฐ๐˜ฏ ๐˜“๐˜ช๐˜ฏ๐˜ฌ๐˜ฆ๐˜ฅ๐˜๐˜ฏ.


1 additional answer

Sort by: Most helpful
  1. Smaran Thoomu 32,530 Reputation points Microsoft External Staff Moderator
    2025-07-07T17:06:47.4066667+00:00

    Hi Viet Tran

    Thanks for sharing the screenshots and questions. Let's break down your questions:

    How to Set Up Databricks Bundles (DAB) in Azure Databricks UI

    As of now, Databricks Bundles (DAB) are primarily managed via the CLI (Command Line Interface) using the databricks bundle commands. The UI you've shared appears to show a preview or an internal bundle interface, but there is no official UI-based deployment or configuration for Bundles currently available publicly.

    To get started with DAB, the key setup steps are:

    Install the Databricks CLI v0.205 or later

    Configure a databricks.yml bundle configuration file at the root of your project

    Deploy using databricks bundle deploy
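    From a terminal, those steps might look like the following sketch (the install script URL comes from the public Databricks docs; the workspace host is a placeholder, and flags should be checked against your CLI version):

    # Install the Databricks CLI (v0.205+) via the official install script
    curl -fsSL https://raw.githubusercontent.com/databricks/setup-cli/main/install.sh | sh

    # Authenticate against your workspace (host below is an example placeholder)
    databricks auth login --host https://adb-1234567890123456.7.azuredatabricks.net

    # Scaffold a bundle project (creates databricks.yml), validate it, then deploy
    databricks bundle init
    databricks bundle validate
    databricks bundle deploy -t dev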

    Note: The UI shown in your screenshot may appear only if you've initialized the repo as a DAB project and opened it inside Repos.

    Can DAB be used to deploy just notebooks (not pipelines/workflows)?

    Yes. DAB can deploy notebooks, workflows, jobs, and dependencies together. If you're only interested in deploying notebooks (without jobs/pipelines), you can configure a bundle to deploy just the notebook directory.

    For example, a minimal databricks.yml can sync only a notebook directory, with no job or pipeline resources defined (a hedged sketch -- the exact schema should be checked against the bundle configuration reference for your CLI version):

    # databricks.yml -- minimal bundle that deploys only notebooks
    bundle:
      name: notebooks-only

    sync:
      include:
        - notebooks/**

    targets:
      dev:
        workspace:
          host: https://adb-1234567890123456.7.azuredatabricks.net  # example host


    However, you'll still need to use the CLI to deploy it (databricks bundle deploy).

    How to Automatically Migrate Notebooks from Repo to Shared Folder?

    There's no built-in feature in Databricks for automatic migration of notebooks from a Repo to Shared workspace directly. However, you can utilize the Databricks CLI to script this process; for example, using the databricks workspace import command can help import notebooks programmatically. You would need to create a deployment script that reads notebooks from your Repo and imports them into the Shared workspace.
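    As a sketch of that scripted approach (the Repo and target paths are hypothetical examples, and command spellings/flags should be verified against your CLI version's workspace command reference):

    # Hypothetical paths -- adjust to your Repo and target folder
    REPO_PATH="/Repos/user@example.com/my-repo/notebooks"
    SHARED_PATH="/Shared/notebooks"

    # Export the notebooks from the Repo path to a local temp directory
    databricks workspace export-dir "$REPO_PATH" ./tmp_notebooks

    # Import them into the Shared folder, overwriting any existing copies
    databricks workspace import-dir ./tmp_notebooks "$SHARED_PATH" --overwrite

    A script like this can run in a CI/CD pipeline after each merge so the Shared copies stay in sync with the Repo.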

    I hope this helps. If you have any questions, please let us know.


    If the above response helps, please click "Accept Answer" and select "Yes" to help others with similar questions.

