Azure Databricks provides a comprehensive suite of tools and integrations to support your data processing workflows.
Data processing or analysis workflows with Azure Databricks Jobs
You can use an Azure Databricks job to run a data processing or data analysis task in an Azure Databricks cluster with scalable resources. Your job can consist of a single task or can be a large, multi-task workflow with complex dependencies. Azure Databricks manages the task orchestration, cluster management, monitoring, and error reporting for all of your jobs. You can run your jobs immediately or periodically through an easy-to-use scheduling system. You can implement job tasks using notebooks, JARs, Delta Live Tables pipelines, Spark submit commands, or Python, Scala, and Java applications.
You create jobs through the Jobs UI, the Jobs API, or the Databricks CLI. The Jobs UI allows you to monitor, test, and troubleshoot your running and completed jobs.
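As a sketch of what a multi-task workflow with a dependency and a schedule looks like when created through the Jobs API, the following builds a hypothetical request body for the Jobs API 2.1 `jobs/create` endpoint. The job name, notebook paths, cluster settings, and schedule are illustrative placeholders, not recommendations:

```python
import json

# Hedged sketch: a two-task job definition for the Databricks Jobs API 2.1
# (POST /api/2.1/jobs/create). All values below are placeholders.
cluster = {
    "spark_version": "13.3.x-scala2.12",  # a Databricks Runtime version
    "node_type_id": "Standard_DS3_v2",    # an Azure VM type
    "num_workers": 2,
}

job_spec = {
    "name": "nightly-etl",
    "tasks": [
        {
            "task_key": "ingest",
            "notebook_task": {"notebook_path": "/Workspace/etl/ingest"},
            "new_cluster": cluster,
        },
        {
            # depends_on makes this task run only after "ingest" succeeds.
            "task_key": "transform",
            "depends_on": [{"task_key": "ingest"}],
            "notebook_task": {"notebook_path": "/Workspace/etl/transform"},
            "new_cluster": cluster,
        },
    ],
    # Quartz cron syntax: run daily at 02:00 in the given time zone.
    "schedule": {"quartz_cron_expression": "0 0 2 * * ?", "timezone_id": "UTC"},
}

# To submit, POST this JSON to https://<workspace-url>/api/2.1/jobs/create
# with an "Authorization: Bearer <personal-access-token>" header.
print(json.dumps(job_spec, indent=2))
```

The same JSON document can also be passed to the Databricks CLI or pasted into the Jobs UI's JSON editor, since all three surfaces share the job-settings schema.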
To get started:
- Create your first Azure Databricks jobs workflow with the quickstart.
- Learn how to create, view, and run workflows with the Azure Databricks jobs user interface.
- Learn about Jobs API updates to support creating and managing workflows with Azure Databricks jobs.
- Learn how to use dbt transformations in a workflow.
- Learn how to use Apache Airflow to manage and schedule Azure Databricks jobs.
- Learn how to use Databricks SQL tasks in a workflow.
- Learn how to use Python wheels in workflow tasks.
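The task types listed above share one job-spec shape. As one example, a hypothetical task that runs an entry point from a Python wheel might look like the following sketch; the package name, entry point, wheel path, and cluster ID are all placeholders:

```python
import json

# Hedged sketch of a "python_wheel_task" entry in a job's task list.
# Every identifier below is a hypothetical placeholder.
wheel_task = {
    "task_key": "run-wheel",
    "python_wheel_task": {
        "package_name": "my_etl",            # distribution name of the wheel
        "entry_point": "main",               # entry point declared by the package
        "parameters": ["--date", "2023-01-01"],
    },
    # The wheel must be attached to the task as a library.
    "libraries": [
        {"whl": "dbfs:/FileStore/wheels/my_etl-0.1.0-py3-none-any.whl"}
    ],
    "existing_cluster_id": "1234-567890-abcde123",
}

print(json.dumps(wheel_task, indent=2))
```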
Transform your data with Delta Live Tables
Delta Live Tables requires the Premium plan. Contact your Databricks account representative for more information.
Delta Live Tables is a framework for building reliable, maintainable, and testable data processing pipelines. You define the transformations to perform on your data, and Delta Live Tables manages task orchestration, cluster management, monitoring, data quality, and error handling. You can build your entire data processing workflow with a Delta Live Tables pipeline, or you can integrate your pipeline into an Azure Databricks jobs workflow to orchestrate a complex data processing workflow.
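A minimal pipeline definition might look like the following sketch. The `dlt` module and the `spark` session are only available inside a running Delta Live Tables pipeline, so this is a pipeline source file rather than a standalone script, and all table names, paths, and expectations here are hypothetical:

```python
import dlt
from pyspark.sql.functions import col

# Hypothetical bronze table: raw orders loaded from cloud storage.
@dlt.table(comment="Raw orders loaded from a landing zone.")
def raw_orders():
    return spark.read.format("json").load("/mnt/landing/orders/")

# Hypothetical silver table: rows failing the expectation are dropped,
# and Delta Live Tables records the drop counts as data quality metrics.
@dlt.table(comment="Orders with non-positive amounts dropped.")
@dlt.expect_or_drop("positive_amount", "amount > 0")
def clean_orders():
    return dlt.read("raw_orders").where(col("order_id").isNotNull())
```

Because the transformations are declared rather than orchestrated by hand, Delta Live Tables infers the dependency of `clean_orders` on `raw_orders` and manages execution order, clusters, and retries for you.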
To get started, see the Delta Live Tables introduction.