Workspace CLI (legacy)

Important

This documentation has been retired and might not be updated.

This information applies to legacy Databricks CLI versions 0.18 and below. Databricks recommends that you use the newer Databricks CLI version 0.205 or above instead. See What is the Databricks CLI?. To find your version of the Databricks CLI, run databricks -v.

To migrate from Databricks CLI version 0.18 or below to Databricks CLI version 0.205 or above, see Databricks CLI migration.

You run Databricks workspace CLI subcommands by appending them to databricks workspace. These subcommands call the Workspace API.
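
For example, assuming the CLI has already been set up with authentication (for instance by running databricks configure --token), the following sketch lists the contents of a placeholder home folder:

databricks workspace ls /Users/someone@example.com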

databricks workspace -h
Usage: databricks workspace [OPTIONS] COMMAND [ARGS]...

  Utility to interact with the Databricks workspace. Workspace paths must be
  absolute and be prefixed with `/`.

Common Options:
  -v, --version  [VERSION]
  -h, --help     Show this message and exit.

Commands:
  delete      Deletes objects from the Databricks workspace. rm and delete are synonyms.
    Options:
        -r, --recursive
  export      Exports a file from the Databricks workspace.
    Options:
      -f, --format FORMAT      SOURCE, HTML, JUPYTER, or DBC. Set to SOURCE by default.
      -o, --overwrite          Overwrites file with the same name as a workspace file.
  export_dir  Recursively exports a directory from the Databricks workspace.
    Options:
      -o, --overwrite          Overwrites local files with the same names as workspace files.
  import      Imports a file from local to the Databricks workspace.
    Options:
      -l, --language LANGUAGE  SCALA, PYTHON, SQL, R  [required]
      -f, --format FORMAT      SOURCE, HTML, JUPYTER, or DBC. Set to SOURCE by default.
      -o, --overwrite          Overwrites workspace files with the same names as local files.
  import_dir  Recursively imports a directory to the Databricks workspace.

    Only directories and files with the extensions .scala, .py, .sql, .r, .R,
    .ipynb are imported. When imported, these extensions are stripped off
    the name of the notebook.

    Options:
      -o, --overwrite          Overwrites workspace files with the same names as local files.
      -e, --exclude-hidden-files
  list        Lists objects in the Databricks workspace. ls and list are synonyms.
    Options:
      --absolute               Displays absolute paths.
      -l                       Displays full information including ObjectType, Path, Language
  ls          Lists objects in the Databricks workspace. ls and list are synonyms.
    Options:
      --absolute               Displays absolute paths.
      -l                       Displays full information including ObjectType, Path, Language
  mkdirs      Makes directories in the Databricks workspace.
  rm          Deletes objects from the Databricks workspace. rm and delete are synonyms.
    Options:
        -r, --recursive

Delete an object from a workspace

To display usage documentation, run databricks workspace delete --help or databricks workspace rm --help.

databricks workspace delete --recursive "/Users/someone@example.com/My Folder"

Or:

databricks workspace rm --recursive "/Users/someone@example.com/My Folder"

If successful, no output is displayed.
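
A single notebook can be deleted without the --recursive flag. A minimal sketch, using a placeholder notebook path:

databricks workspace delete "/Users/someone@example.com/My Notebook"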

Export a file from a workspace to your local filesystem

To display usage documentation, run databricks workspace export --help.

databricks workspace export --overwrite --format JUPYTER "/Users/someone@example.com/My Python Notebook" /Users/me/Downloads

This command can also be used to export notebooks from a Databricks Git folder:

databricks workspace export "/Repos/someone@example.com/MyRepoNotebook" /Users/me/Downloads

If successful, no output is displayed.
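
The --format flag accepts any of the values listed in the help output above. For example, a sketch that exports the same placeholder notebook as a DBC archive rather than a Jupyter notebook:

databricks workspace export --overwrite --format DBC "/Users/someone@example.com/My Python Notebook" /Users/me/Downloads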

Export a directory from a workspace to your local filesystem

To display usage documentation, run databricks workspace export_dir --help.

databricks workspace export_dir --overwrite /Users/someone@example.com/my-folder /Users/me/Downloads/my-folder
/Users/someone@example.com/my-folder/My Python Notebook -> /Users/me/Downloads/my-folder/My Python Notebook.py
/Users/someone@example.com/my-folder/My Scala Notebook -> /Users/me/Downloads/my-folder/My Scala Notebook.scala
/Users/someone@example.com/my-folder/My R Notebook -> /Users/me/Downloads/my-folder/My R Notebook.r
/Users/someone@example.com/my-folder/My SQL Notebook -> /Users/me/Downloads/my-folder/My SQL Notebook.sql
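
The same pattern should also work for a Git folder path under /Repos, for example to take a local snapshot of a repo's notebooks. A sketch, with placeholder paths:

databricks workspace export_dir "/Repos/someone@example.com/my-repo" /Users/me/Downloads/my-repo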

Import a file from your local filesystem into a workspace

To display usage documentation, run databricks workspace import --help.

Only files with the extensions .scala, .py, .sql, .r, .R can be imported. When imported, these extensions are stripped from the notebook name.

databricks workspace import ./a.py /Users/someone@example.com/example
./a.py -> /Users/someone@example.com/example/a
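
Because --language is marked as required in the help output above, it can be safer to pass it explicitly along with --overwrite. A sketch using the same placeholder paths:

databricks workspace import --language PYTHON --overwrite ./a.py /Users/someone@example.com/example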

Import a directory from your local filesystem into a workspace

To display usage documentation, run databricks workspace import_dir --help.

This command recursively imports a directory from the local filesystem into the workspace. Only directories and files with the extensions .scala, .py, .sql, .r, .R, and .ipynb are imported, as shown in the help output above. When imported, these extensions are stripped from the notebook name.

To overwrite existing notebooks at the target path, add the flag --overwrite or -o.

tree
.
├── a.py
├── b.scala
├── c.sql
├── d.R
└── e
databricks workspace import_dir . /Users/someone@example.com/example
./a.py -> /Users/someone@example.com/example/a
./b.scala -> /Users/someone@example.com/example/b
./c.sql -> /Users/someone@example.com/example/c
./d.R -> /Users/someone@example.com/example/d
databricks workspace ls /Users/someone@example.com/example -l
NOTEBOOK   a  PYTHON
NOTEBOOK   b  SCALA
NOTEBOOK   c  SQL
NOTEBOOK   d  R
DIRECTORY  e
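
To re-run the import over an existing target and skip hidden files (for example, .git directories and dotfiles), combine the documented flags. A sketch using the same placeholder paths:

databricks workspace import_dir --overwrite --exclude-hidden-files . /Users/someone@example.com/example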

List objects in a workspace

To display usage documentation, run databricks workspace list --help or databricks workspace ls --help.

databricks workspace list --absolute --long --id /Users/someone@example.com

Or:

databricks workspace ls --absolute --long --id /Users/someone@example.com
NOTEBOOK           /Users/someone@example.com/My Python Notebook  PYTHON  1234567898012345
NOTEBOOK           /Users/someone@example.com/My Scala Notebook   SCALA   2345678980123456
NOTEBOOK           /Users/someone@example.com/My R Notebook       R       3456789801234567
DIRECTORY          /Users/someone@example.com/My Directory                4567898012345678
MLFLOW_EXPERIMENT  /Users/someone@example.com/My_Experiment               5678980123456789
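
Without the optional flags, the listing shows only the object names under the given path. A minimal sketch:

databricks workspace ls /Users/someone@example.com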

Create a directory in a workspace

To display usage documentation, run databricks workspace mkdirs --help.

databricks workspace mkdirs "/Users/someone@example.com/My New Folder"

If successful, no output is displayed.
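
The underlying Workspace API creates any missing parent directories, so nested paths can be created in one call. A sketch with placeholder folder names:

databricks workspace mkdirs "/Users/someone@example.com/Level 1/Level 2"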