NIKHIL KUMAR - Thanks for the question and for using the MS Q&A platform.
For your first question: yes, you can create and deploy a Databricks workflow with a job cluster using an Azure DevOps CI/CD pipeline. Both the Databricks CLI and the Databricks REST API can create and manage Databricks resources, including clusters, jobs, and notebooks, and either can be invoked from an Azure DevOps pipeline step to automate the creation and deployment of your Databricks workflow.
For more details, refer to Create and run jobs using the CLI, API, or notebooks.
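As a concrete illustration, here is a minimal sketch of the job definition a pipeline step could send to the Jobs API. The workspace URL, job name, notebook path, and cluster sizing below are hypothetical placeholders, not values from your environment:

```python
import json

# Hypothetical workspace URL -- replace with your own, and authenticate
# with a PAT or Azure AD token in a real pipeline step.
DATABRICKS_HOST = "https://adb-1234567890123456.7.azuredatabricks.net"

# Job definition using a *job cluster* ("new_cluster"): the cluster is
# created when the job runs and terminated when the run finishes.
job_payload = {
    "name": "nightly-etl",  # hypothetical job name
    "tasks": [
        {
            "task_key": "run_notebook",
            "notebook_task": {"notebook_path": "/Repos/project/etl"},
            "new_cluster": {
                "spark_version": "13.3.x-scala2.12",
                "node_type_id": "Standard_DS3_v2",
                "num_workers": 2,
            },
        }
    ],
}

# In an Azure DevOps pipeline step you would POST this payload to the
# Jobs API "create" endpoint, e.g. with the requests library:
#   requests.post(f"{DATABRICKS_HOST}/api/2.1/jobs/create",
#                 headers={"Authorization": f"Bearer {token}"},
#                 json=job_payload)
print(json.dumps(job_payload, indent=2))
```

The same JSON can instead be saved to a file and passed to the Databricks CLI from a pipeline script step, which avoids hand-rolling the HTTP call.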
For your second question, there is no option to create a job cluster directly from the "Compute" tab in the Databricks workspace portal. A job cluster is defined as part of a job; when configuring a job, however, you can select an existing all-purpose cluster created under the Compute tab.
To create a job cluster, you can follow these steps:
- Navigate to your Databricks workspace and click on the "Workflows" tab.
- Click on Jobs, then click the "Create Job" button.
- From here you can define a new job cluster or choose from your existing compute clusters.
Hope this helps. If this answers your query, do click Accept Answer and Yes for "Was this answer helpful". And if you have any further queries, do let us know.