Jobs that output no result or unexpected results are common troubleshooting scenarios for streaming queries. You can use the job diagram while testing your query locally in Visual Studio to examine the intermediate result set and metrics for each step. Job diagrams can help you quickly isolate the source of a problem when you troubleshoot issues.
An Azure Stream Analytics script transforms input data into output data. The job diagram shows how data flows from input sources (Event Hubs, IoT Hub, and so on) through multiple query steps and, finally, to output sinks. Each query step maps to a temporary result set defined in the script with a WITH statement. You can view the data and metrics of each query step's intermediate result set to find the source of an issue.
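As a minimal sketch, a script with one intermediate result set might look like the following. The TaxiRide input, TripData step, and regionaggEH output names match those shown in the metrics tables later in this section; the column names and window size are assumptions for illustration only:

```sql
-- TaxiRide is the input; TripData is a temporary result set that
-- appears as its own query step in the job diagram.
WITH TripData AS (
    SELECT
        RegionId,       -- hypothetical column
        PickupTime,     -- hypothetical column
        TripTimeInSeconds
    FROM TaxiRide
    WHERE TripTimeInSeconds > 0
)

-- regionaggEH is the output sink.
SELECT
    RegionId,
    COUNT(*) AS TripCount
INTO regionaggEH
FROM TripData
GROUP BY RegionId, TumblingWindow(minute, 5)
```

In the job diagram, selecting the TripData step would show the rows produced by the WITH clause before they're aggregated into the output.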
Note
The job diagram only shows the data and metrics for local testing on a single node. It shouldn't be used for performance tuning and troubleshooting.
Use this Quickstart to learn how to create a Stream Analytics job using Visual Studio or export an existing job to a local project. If you want to test the query with local input data, follow these instructions. If you want to test with live input, move to the next step.
Note
If you export a job to a local project and want to test against a live input stream, you need to specify the credentials for all inputs again.
1. Choose the input and output source from the script editor, and then select Run locally. The job diagram appears on the right side.
2. Select a query step to navigate to the script. You're automatically directed to the corresponding script in the editor on the left.
3. Select a query step, and then select Preview in the pop-up dialog. The result set appears in a tab in the bottom result window.
In this section, you explore the metrics available for each part of the diagram.
| Metric | Description |
|---|---|
| TaxiRide | The name of the input. |
| Event Hub | The input source type. |
| Events | The number of events read. |
| Backlogged Event Sources | How many more messages need to be read for Event Hubs and IoT Hub inputs. |
| Events in Bytes | The number of bytes read. |
| Degraded Events | The count of events that had an issue other than with deserialization. |
| Early Events | The number of events that have an application timestamp after the high watermark. |
| Late Events | The number of events that have an application timestamp before the high watermark. |
| Event Sources | The number of data units read. For example, the number of blobs. |
| Metric | Description |
|---|---|
| TaxiRide | The name of the input. |
| Row Count | The number of rows generated from the step. |
| Data Size | The size of data generated from this step. |
| Local input | Use local data as input. |
| Metric | Description |
|---|---|
| TripData | The name of the temporary result set. |
| Row Count | The number of rows generated from the step. |
| Data Size | The size of data generated from this step. |
| Metric | Description |
|---|---|
| regionaggEH | The name of the output. |
| Events | The number of events output to sinks. |
| Metric | Description |
|---|---|
| regionaggEH | The name of the output. |
| Local Output | Result output to a local file. |
| Row Count | The number of rows output to the local file. |
| Data Size | The size of data output to the local file. |
If you no longer need the job diagram, select Close in the top-right corner. After you close the diagram window, you need to start local testing again to see it.
Other job-level metrics show up in the pop-up console. Press Ctrl+C in the console if you want to stop the job.
Power BI and Azure Data Lake Storage Gen1 output sinks aren't supported because of authentication model limitations. Only cloud input options support time policies; local input options don't.
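Time policies apply when a query uses an application timestamp via TIMESTAMP BY, as in the hedged sketch below; with a local input, the late-arrival and out-of-order adjustments that cloud inputs apply to that timestamp don't take effect. The input and column names here are assumptions for illustration:

```sql
-- Uses the event's application time (PickupTime, a hypothetical column)
-- rather than its arrival time. Late-arrival and out-of-order policies
-- that adjust this timestamp apply only to cloud inputs.
SELECT
    RegionId,
    COUNT(*) AS TripCount
FROM TaxiRide TIMESTAMP BY PickupTime
GROUP BY RegionId, TumblingWindow(minute, 5)
```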