After modifying the bundle deployment path, an error occurred in the Azure DevOps Databricks CI/CD pipeline

zmsoft 405 Reputation points
2025-05-09T06:20:36.9366667+00:00

Hi there,

I modified the workspace.root_path in the databricks.yml file, and after that my pipeline no longer runs successfully.

The error message was:

Error: unable to create directory at /Workspace/src/.bundle/azure-devops-demo/prd/files: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx does not have View permissions on 1069779813418078. Please contact the owner or an administrator for access.
Error: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx does not have Manage permissions on /src. Please contact the owner or an administrator for access.

My databricks.yml content:


# Filename: databricks.yml
bundle:
  name: azure-devops-demo

variables:
  job_prefix:
    description: A unifying prefix for this bundle's job and task names.
    default: azure-devops-demo
  spark_version:
    description: The cluster's Spark version ID.
    default: 15.4.x-scala2.12
  node_type_id:
    description: The cluster's node type ID.
    default: Standard_D4ads_v5

# These are the default workspace settings if not otherwise overridden in
# the following "targets" top-level mapping.
workspace:
  artifact_path: ${workspace.root_path}/artifacts
  file_path: ${workspace.root_path}/files
  root_path: /Workspace/src/.bundle/${bundle.name}/${bundle.target}
  state_path: ${workspace.root_path}/state

# These are the permissions to apply to experiments, jobs, models, and pipelines defined
# in the "resources" mapping.
permissions:
  - level: CAN_VIEW
    group_name: admins
  - level: CAN_MANAGE
    user_name: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx
  - level: CAN_MANAGE
    service_principal_name: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx


artifacts:
  dabdemo-wheel:
    type: whl
    path: ./Libraries/python/dabdemo

# This is the identity to use to run the bundle
run_as:
  - service_principal_name: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx


resources:
  jobs:
    run-unit-tests:
      name: ${var.job_prefix}-run-unit-tests
      tasks:
        - task_key: ${var.job_prefix}-run-unit-tests-task
          new_cluster:
            spark_version: ${var.spark_version}
            node_type_id: ${var.node_type_id}
            num_workers: 1
            spark_env_vars:
              WORKSPACEBUNDLEPATH: ${workspace.root_path}
          notebook_task:
            notebook_path: ./run_unit_tests.py
            source: WORKSPACE
          libraries:
            - pypi:
                package: pytest
    run-dabdemo-notebook:
      name: ${var.job_prefix}-run-dabdemo-notebook
      tasks:
        - task_key: ${var.job_prefix}-run-dabdemo-notebook-task
          new_cluster:
            spark_version: ${var.spark_version}
            node_type_id: ${var.node_type_id}
            num_workers: 1
            spark_env_vars:
              WORKSPACEBUNDLEPATH: ${workspace.root_path}
          notebook_task:
            notebook_path: ./dabdemo_notebook.py
            source: WORKSPACE
          libraries:
            - whl: '${workspace.root_path}/files/Libraries/python/dabdemo/dist/dabdemo-0.0.1-py3-none-any.whl'

targets:
  prd:
    mode: production

Any suggestions? How should I modify the content of the databricks.yml file?

Thanks & Regards,

zmsoft


Accepted answer
    phemanth 15,570 Reputation points Microsoft External Staff Moderator
    2025-05-09T07:03:42.3266667+00:00

    @zmsoft

    It sounds like you're running into permission issues after modifying workspace.root_path in your databricks.yml file for your Azure DevOps CI/CD pipeline. Specifically, the error messages indicate that the identity deploying the bundle (the service principal whose ID appears in the error) does not have View permission on the target object and does not have Manage permission on /Workspace/src, so it cannot create the bundle's deployment directory there.

    Here are a few suggestions:

    Check Permissions: Ensure that the user (or service principal) specified in your pipeline has the correct permissions. The error states that the user does not have View or Manage permissions. You may need to:

    • Go to the Azure Databricks workspace.
    • Navigate to your directory path (/Workspace/src/.bundle/...) and check the permissions.
    • Ensure that the identity has the required permissions for both viewing and managing that directory. You can grant these from the workspace file browser by opening the folder's Share (permissions) dialog, or via the Permissions REST API, or ask a workspace admin to do it for you.

    Validate the Path: Ensure that the path you set for workspace.root_path in databricks.yml is correct. After the modification, double-check that the path exists and that there are no typos.
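
    For reference, when workspace.root_path is not overridden, bundles deploy under the current identity's home folder, which the deploying user or service principal can always write to. Reverting to that default shape is a quick way to confirm that the custom /Workspace/src location is what's causing the failure; running databricks bundle validate -t prd locally will also let you inspect the resolved configuration. A minimal sketch (the path below is the usual default shape, adjust as needed):

    # Sketch: fall back to a per-identity bundle root instead of /Workspace/src.
    # ${workspace.current_user.userName} resolves to the deploying user or
    # service principal, so no extra folder permissions are needed;
    # artifact_path, file_path and state_path already derive from root_path.
    workspace:
      root_path: /Workspace/Users/${workspace.current_user.userName}/.bundle/${bundle.name}/${bundle.target}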

    Revoke and Reassign Permissions: Sometimes, removing and then reassigning permissions can help to refresh access. Remove the user from the directory and then re-add them with the necessary View and Manage permissions.

    Add Missing Permission Levels: In your bundle's top-level permissions mapping, the admins group only has CAN_VIEW. If that group (or another group the deploying identity belongs to) should also be able to manage the deployed resources, grant it CAN_MANAGE as well; a sketch follows this item.
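
    A minimal sketch of what that could look like in the bundle's top-level permissions mapping (as the comment in your file notes, this mapping applies to the resources defined under resources, such as the jobs; it does not by itself give the deploying identity the right to create folders under /Workspace/src, which is what the error is about):

    # Sketch, assuming the admins group should also be able to manage the
    # bundle's jobs; the service principal entry mirrors the one already
    # present in your file.
    permissions:
      - level: CAN_MANAGE
        group_name: admins
      - level: CAN_MANAGE
        service_principal_name: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx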

    Review Databricks Documentation: Refer to the documentation on CI/CD with Azure DevOps and Databricks Asset Bundles; it has specific instructions for defining and deploying bundles, along with managing access.

    If you continue experiencing issues, please consider these follow-up questions to clarify the situation:

    1. What user or service principal is running the pipeline? (The deploy identity usually comes from the authentication variables in your pipeline definition; see the sketch after this list.)
    2. Have you checked if the directory /Workspace/src/.bundle/azure-devops-demo/prd/files exists and has the correct permissions set?
    3. Are there any specific group policies that might be affecting permissions?
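
    Regarding question 1, a hedged sketch of the deploy step in an Azure Pipelines definition is below. The file name and the pipeline variable names are illustrative assumptions, but the DATABRICKS_HOST / ARM_* environment variables are the ones the Databricks CLI uses for Azure service principal authentication, and the application (client) ID supplied there is the identity that needs access to workspace.root_path.

    # azure-pipelines.yml (illustrative snippet, not necessarily your actual pipeline)
    steps:
      - script: |
          databricks bundle validate -t prd
          databricks bundle deploy -t prd
        displayName: 'Validate and deploy the bundle to prd'
        env:
          DATABRICKS_HOST: $(DATABRICKS_HOST)      # workspace URL, e.g. https://adb-xxxx.azuredatabricks.net
          ARM_TENANT_ID: $(ARM_TENANT_ID)          # Microsoft Entra ID tenant
          ARM_CLIENT_ID: $(ARM_CLIENT_ID)          # this service principal must be able to write to root_path
          ARM_CLIENT_SECRET: $(ARM_CLIENT_SECRET)  # stored as a secret pipeline variable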
