@SADIQALI KP - Thanks for the question and using MS Q&A platform.
To determine how much of the job cluster's allocated compute a job is consuming while it is running, you can use the Spark application UI.
In the UI, the Executors tab displays Summary and Detail views of the configuration and consumed resources. Based on this information, you can decide whether to change executor settings for the entire cluster or only for a particular set of job runs.
This guide walks you through the different debugging options available to peek at the internals of your Apache Spark application: Debugging with the Apache Spark UI
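If you prefer to pull the same numbers programmatically, the Spark monitoring REST API exposes the data shown on the Executors tab. Below is a minimal sketch, assuming the driver's Spark UI is reachable at http://localhost:4040 (the host/port are assumptions you would adapt to your cluster):

```python
import requests

# Base URL of the Spark UI on the driver node; adjust host/port for your cluster (assumption).
BASE_URL = "http://localhost:4040/api/v1"

# Look up the running application's ID, then list its executors.
app_id = requests.get(f"{BASE_URL}/applications").json()[0]["id"]
executors = requests.get(f"{BASE_URL}/applications/{app_id}/executors").json()

# Print per-executor cores and memory usage, mirroring the Executors tab's Summary view.
for e in executors:
    print(
        f"executor={e['id']:>8} cores={e['totalCores']} "
        f"activeTasks={e['activeTasks']} "
        f"memoryUsed={e['memoryUsed'] / 1024**2:.1f} MiB "
        f"of maxMemory={e['maxMemory'] / 1024**2:.1f} MiB"
    )
```

The same endpoints work against the Spark history server for completed runs, so you can compare resource usage across past job executions as well.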
Hope this helps. Do let us know if you have any further queries.
If this answers your query, do click Accept Answer and Yes for "Was this answer helpful".