How can I understand how much of the allocated job cluster's compute resources each job is consuming?

SADIQALI KP 41 Reputation points
2023-09-27T09:04:17.9666667+00:00

How can I determine how much compute power each job is consuming from the allocated job cluster while it is running? I need to know the resource usage of each job in order to optimize the cluster's performance.

Azure Databricks
An Apache Spark-based analytics platform optimized for Azure.

Accepted answer
  PRADEEPCHEEKATLA-MSFT 88,471 Reputation points Microsoft Employee
    2023-09-28T08:03:44.34+00:00

    @SADIQALI KP - Thanks for the question and using MS Q&A platform.

    To determine how much compute power each job is consuming from the allocated job cluster while it is running, you can use the Spark application UI.


    In the UI, the Executors tab displays Summary and Detail views of the configuration and the resources consumed. Based on this information, you can decide whether to change executor settings for the entire cluster or for a particular set of job runs.


    This guide walks you through the debugging options available for inspecting the internals of your Apache Spark application: Debugging with the Apache Spark UI
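    The same executor metrics shown in the UI can also be pulled programmatically from Spark's monitoring REST API at `/api/v1/applications/{app-id}/executors`, which is handy if you want to track per-job resource usage over time rather than eyeballing the UI. The sketch below parses a response in that endpoint's shape and reports each worker's memory utilisation; the sample payload is made up for illustration, so fetch the real JSON from your own cluster.

    ```python
    import json

    # Sample response in the shape returned by Spark's REST API endpoint
    # /api/v1/applications/{app-id}/executors. The values below are
    # illustrative only; replace them with the JSON from your cluster.
    sample_response = json.dumps([
        {"id": "driver", "totalCores": 0, "memoryUsed": 52428800,
         "maxMemory": 4294967296, "totalDuration": 0, "totalGCTime": 0},
        {"id": "0", "totalCores": 4, "memoryUsed": 1073741824,
         "maxMemory": 4294967296, "totalDuration": 620000, "totalGCTime": 9000},
        {"id": "1", "totalCores": 4, "memoryUsed": 2147483648,
         "maxMemory": 4294967296, "totalDuration": 580000, "totalGCTime": 12000},
    ])

    def summarize_executors(payload: str) -> dict:
        """Return each worker's storage-memory use as a fraction of maxMemory."""
        executors = json.loads(payload)
        return {
            e["id"]: round(e["memoryUsed"] / e["maxMemory"], 3)
            for e in executors
            if e["id"] != "driver"  # report workers only, skip the driver
        }

    print(summarize_executors(sample_response))
    # → {'0': 0.25, '1': 0.5}
    ```

    On a live cluster you would replace `sample_response` with the body of an HTTP GET against the Spark UI host; per-executor `totalDuration` and `totalGCTime` from the same payload are useful for spotting which jobs dominate CPU time.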

    Hope this helps. Do let us know if you have any further queries.


    If this answers your query, do click Accept Answer and Yes for was this answer helpful. And, if you have any further query do let us know.

    1 person found this answer helpful.

0 additional answers

