temporary-path-credentials command group

Note

This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview.

Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions.

The temporary-path-credentials command group within the Databricks CLI contains commands to generate short-lived, downscoped credentials used to access external cloud storage locations registered in Databricks. These credentials provide secure and time-limited access to data in cloud environments such as AWS, Azure, and Google Cloud. See Unity Catalog credential vending for external system access.

databricks temporary-path-credentials generate-temporary-path-credentials

Generate a short-lived credential for directly accessing cloud storage locations registered in Databricks. The Generate Temporary Path Credentials API is only supported for external storage paths, specifically external locations and external tables. Managed tables are not supported by this API.

The metastore must have the external_access_enabled flag set to true (the default is false). The caller must have the EXTERNAL_USE_LOCATION privilege on the external location; this privilege can only be granted by external location owners. For requests on existing external tables, the caller must also have the EXTERNAL_USE_SCHEMA privilege on the parent schema; this privilege can only be granted by catalog owners.

databricks temporary-path-credentials generate-temporary-path-credentials URL OPERATION [flags]

Arguments

URL

    URL for path-based access.

OPERATION

    The operation being performed on the path. Supported values: PATH_CREATE_TABLE, PATH_READ, PATH_READ_WRITE.

Options

--dry-run

    Optional flag to test the request without generating credentials.

--json JSON

    The inline JSON string or the @path to the JSON file with the request body.

Global flags

Examples

The following example generates temporary credentials for read access to an S3 location:

databricks temporary-path-credentials generate-temporary-path-credentials s3://my-bucket/my-path PATH_READ
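The generated credential can then be passed to standard cloud tooling. The following sketch is an assumption-laden example, not part of the official reference: it assumes the JSON response contains an aws_temp_credentials object with access_key_id, secret_access_key, and session_token fields (the Unity Catalog credential vending response shape), and that jq is installed:

```shell
# Request the credential as JSON using the global -o/--output flag.
creds=$(databricks temporary-path-credentials generate-temporary-path-credentials \
  s3://my-bucket/my-path PATH_READ --output json)

# Assumed response fields: aws_temp_credentials.{access_key_id,secret_access_key,session_token}
export AWS_ACCESS_KEY_ID=$(echo "$creds" | jq -r '.aws_temp_credentials.access_key_id')
export AWS_SECRET_ACCESS_KEY=$(echo "$creds" | jq -r '.aws_temp_credentials.secret_access_key')
export AWS_SESSION_TOKEN=$(echo "$creds" | jq -r '.aws_temp_credentials.session_token')

# The AWS CLI now reads the short-lived credential from the environment.
aws s3 ls s3://my-bucket/my-path/
```

Because the credential is short-lived, repeat the generate call when it expires rather than caching it long term.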

The following example generates temporary credentials for read-write access to an Azure storage location:

databricks temporary-path-credentials generate-temporary-path-credentials abfss://container@storage.dfs.core.windows.net/path PATH_READ_WRITE
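On Azure, the vended credential is a user delegation SAS token. The following is a hedged sketch, assuming the response contains an azure_user_delegation_sas object with a sas_token field and that jq is installed; verify the field names against your CLI version's JSON output:

```shell
# Request the credential as JSON using the global -o/--output flag.
resp=$(databricks temporary-path-credentials generate-temporary-path-credentials \
  abfss://container@storage.dfs.core.windows.net/path PATH_READ_WRITE --output json)

# Assumed response field: azure_user_delegation_sas.sas_token
sas=$(echo "$resp" | jq -r '.azure_user_delegation_sas.sas_token')

# Append the SAS token as a query string for tools such as azcopy.
azcopy list "https://storage.blob.core.windows.net/container/path?${sas}"
```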

The following example generates temporary credentials for creating a table in a GCS location:

databricks temporary-path-credentials generate-temporary-path-credentials gs://my-bucket/my-path PATH_CREATE_TABLE

The following example performs a dry run to test the request:

databricks temporary-path-credentials generate-temporary-path-credentials s3://my-bucket/my-path PATH_READ --dry-run

The following example generates credentials using JSON:

databricks temporary-path-credentials generate-temporary-path-credentials s3://my-bucket/my-path PATH_READ --json '{}'
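A populated request body can also be supplied through --json. The sketch below assumes the body mirrors the positional arguments with url, operation, and optional dry_run fields; confirm the accepted fields with the command's --help output before relying on them:

```shell
# Assumed request body fields: url, operation, dry_run.
databricks temporary-path-credentials generate-temporary-path-credentials \
  s3://my-bucket/my-path PATH_READ --json '{
    "url": "s3://my-bucket/my-path",
    "operation": "PATH_READ",
    "dry_run": true
  }'
```

The same body can be stored in a file and passed as --json @request.json.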

Global flags

--debug

    Whether to enable debug logging.

-h or --help

    Display help for the Databricks CLI, the related command group, or the related command.

--log-file string

    A string representing the file to write output logs to. If this flag is not specified, output logs are written to stderr.

--log-format format

    The log format type, text or json. The default value is text.

--log-level string

    A string representing the log level. If this flag is not specified, logging is disabled.

-o, --output type

    The command output type, text or json. The default value is text.

-p, --profile string

    The name of the profile in the ~/.databrickscfg file to use to run the command. If this flag is not specified, the profile named DEFAULT is used, if it exists.

--progress-format format

    The format to display progress logs: default, append, inplace, or json.

-t, --target string

    If applicable, the bundle target to use.