The Apache Spark run series feature automatically groups the following into run series:
- Apache Spark applications from your recurring pipeline activities or manual notebook runs.
- Apache Spark job runs from the same notebook or Apache Spark job definition.
The run series feature visually represents the duration trend for each Spark application instance, along with the corresponding data input and output trendlines. It also automatically scans the run series and detects anomalous Spark application runs. From a run series, you can open the details of a particular Spark application.
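Conceptually, the grouping described above can be sketched as follows. This is a minimal illustration in Python; the record fields and names are assumptions for the example, not the Fabric API schema.

```python
from collections import defaultdict

# Hypothetical run records; field names are illustrative only.
runs = [
    {"app_id": "app-001", "source_item": "SalesNotebook", "duration_s": 120},
    {"app_id": "app-002", "source_item": "SalesNotebook", "duration_s": 135},
    {"app_id": "app-003", "source_item": "IngestJobDef", "duration_s": 300},
]

# Group runs into series keyed by the notebook or Spark job
# definition they originated from.
series = defaultdict(list)
for run in runs:
    series[run["source_item"]].append(run)

for item, item_runs in series.items():
    print(item, [r["duration_s"] for r in item_runs])
```

Each resulting series is then analyzed as one unit, which is what makes duration trends and anomaly detection meaningful across repeated runs of the same item.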
Access the monitor run series feature
You can access the monitor run series feature from the Monitoring hub's historical view:
1. Open the Microsoft Fabric portal and go to the Monitoring hub.
1. Open your Spark job definition or notebook, expand its More options drop-down list, and then select Historical runs.
1. Select the job you want to view, expand More options, and then select Monitor run series.
You can access the monitor run series feature from the notebook or Spark job definition's Recent runs panel:
1. Open the Microsoft Fabric homepage and select the workspace where you want to view the job.
1. Open the Spark job definition or notebook item's context menu.
1. Select Recent runs.
1. Select an application, expand its More options drop-down list, and then select Monitor run series.
You can access the monitor run series feature from the Spark application Monitoring detail page:
View Spark application performance
In the Spark runs graph, you can view the duration trend of the run series. Each vertical bar represents an instance of a notebook or Spark job definition activity run, and its height indicates the run duration. You can also select a run instance to view more detailed information, and zoom in or out on specific time windows. The graph shows the following metrics:
- Duration
- Duration (Anomaly)
- Read bytes
- Write bytes
Select a color icon to show or hide the corresponding metric in all graphs.
When you select an instance of the notebook/Spark job definition activity run in the graph, the instance's Duration time distribution, Executors execution distribution, and Spark configuration are detailed at the bottom of the graph.
Focus mode lets you expand a visual to see more details. For example, if a visual is crowded, use focus mode to zoom in on it.
If a bar is marked red, an anomaly has been detected for that run instance. The Anomalies panel shows the Total duration, Expected duration, and Potential causes for the instance.
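To make the idea of "expected duration" concrete, here is a minimal sketch of one common anomaly heuristic: flagging runs whose duration deviates from the series median by more than a few median absolute deviations. This is an illustrative assumption, not Fabric's actual detection model.

```python
import statistics

def flag_anomalies(durations_s, threshold=3.0):
    """Flag runs whose duration deviates from the series median
    by more than `threshold` median absolute deviations.
    Illustrative heuristic only, not the Fabric algorithm."""
    median = statistics.median(durations_s)
    mad = statistics.median(abs(d - median) for d in durations_s) or 1.0
    return [abs(d - median) / mad > threshold for d in durations_s]

durations = [118, 122, 120, 125, 119, 410]  # last run is unusually slow
print(flag_anomalies(durations))
# → [False, False, False, False, False, True]
```

Here the series median plays the role of the expected duration, and the flagged run is the one a red bar would highlight in the graph.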