This manual test verifies that a mobile system on AC power can play protected and unprotected high-definition content with no perceivable glitches during playback.
Tests in this feature area might have additional documentation, including prerequisites, setup, and troubleshooting information.
Make sure that the Video and Music apps are installed on the system. You can get both apps from the Store.
Make sure the .mp4 and .wmv extensions are registered to the Video app. To do this, right-click a sample file that has the appropriate extension, click Properties, click Open with, and select the Video app.
Make sure the .m4a and .wma extensions are registered to the Music app. To do this, right-click a sample file that has the appropriate extension, click Properties, click Open with, and select the Music app.
The HLK GlitchFree test plays back two video clips using the inbox Video app. The content is played back full screen while ETW logging is enabled in the background. After each scenario, the test post-processes the ETW log and extracts the metrics that determine whether the test passes or fails.
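As a rough, non-authoritative sketch of that post-processing flow, the trace can be converted to XML with the built-in tracerpt tool and then inspected offline. The file names below are placeholders, not names the test itself uses.

```python
import subprocess

# Convert a binary ETW trace (.etl) to XML so individual events can
# be inspected offline. "glitchfree.etl" and "glitchfree.xml" are
# placeholder file names.
subprocess.run(
    ["tracerpt", "glitchfree.etl", "-o", "glitchfree.xml", "-of", "XML", "-y"],
    check=True,
)
```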
Pass/Fail Criteria & Metric Details
Glitch Metrics
Video Glitches - The Media Engine's video renderer (SVR) fires a video glitch event when it detects that a frame was rendered late. The goal for this metric is 0. Provider and event details:
Microsoft-Windows-MediaEngine
Channel - MediaFoundationMediaEngine - 16
Level - win:Verbose - 5
Task - VideoFrameGlitch - 23
Dropped Frames - The Media Engine fires frame drop events when the source drops a frame. When frames are dropped, the user experiences glitchy video. The goal is 0. Provider and event details:
Microsoft-Windows-MediaEngine
Channel - MediaFoundationMediaEngine - 16
Level - win:Verbose - 5
Task - DroppedFrame - 18
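To make the two metrics above concrete, here is a minimal sketch that counts VideoFrameGlitch (task 23) and DroppedFrame (task 18) events from Microsoft-Windows-MediaEngine in the XML produced by tracerpt earlier. The element layout assumed here follows the standard Windows event schema and may vary by tool version; count_events is a hypothetical helper, not part of the test.

```python
import xml.etree.ElementTree as ET

NS = {"e": "http://schemas.microsoft.com/win/2004/08/events/event"}

def count_events(xml_path, provider, task_id):
    # Count events from the given provider whose Task matches task_id.
    count = 0
    for event in ET.parse(xml_path).getroot().findall("e:Event", NS):
        system = event.find("e:System", NS)
        prov = system.find("e:Provider", NS)
        task = system.find("e:Task", NS)
        if (prov is not None and prov.get("Name") == provider
                and task is not None and task.text == str(task_id)):
            count += 1
    return count

# Both glitch metrics have a goal of 0.
glitches = count_events("glitchfree.xml", "Microsoft-Windows-MediaEngine", 23)
drops = count_events("glitchfree.xml", "Microsoft-Windows-MediaEngine", 18)
assert glitches == 0 and drops == 0, f"{glitches} glitches, {drops} dropped frames"
```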
DWM Schedule Glitches - The Desktop Window Manager (DWM) fires a glitch event when DWM samples are rendered late. The goal for this metric is 0. The test starts tracking this event 500 ms after the first PresentedFrame event (Task ID 19, Event ID 115) and stops tracking it 66 ms after the last PresentedFrame event. Provider and event details:
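The tracking window itself is easy to express. A minimal sketch, assuming the PresentedFrame and DWM glitch timestamps (in seconds) have already been extracted from the trace; the function name is hypothetical:

```python
def dwm_glitches_in_window(presented_ts, glitch_ts):
    # Tracking window: from 500 ms after the first PresentedFrame
    # event to 66 ms after the last one; timestamps are in seconds.
    if not presented_ts:
        return 0
    start = min(presented_ts) + 0.500
    end = max(presented_ts) + 0.066
    return sum(1 for t in glitch_ts if start <= t <= end)
```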
Total Device Creation Time - The total device creation time must not exceed 50 ms. Total device creation time is defined as DeviceCreation + CreateVideoDecoder, where the two metrics are defined as follows:
DeviceCreation = The latency between the following two events
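Since the event pairs are not enumerated here, the following sketch simply takes the two measured latencies as inputs and applies the 50 ms budget; the function name is hypothetical.

```python
def device_creation_within_budget(device_creation_ms, create_video_decoder_ms):
    # Total device creation time = DeviceCreation + CreateVideoDecoder.
    # Each input is the latency (ms) between its corresponding event pair.
    total = device_creation_ms + create_video_decoder_ms
    return total <= 50.0
```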
Driver Metrics - The ISR/DPC duration and ISR/DPC storm checks ensure that device drivers are well behaved, so that time-critical multimedia threads can run on a regular basis with limited interruption from ISRs and DPCs. Both checks are sketched in the example after the two criteria below.
ISR/DPC duration: Validates that no individual ISR or DPC runs longer than 3 ms.
ISR/DPC storm: The cumulative duration of all ISRs/DPCs within any 10 ms window must not exceed 4 ms.
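A minimal sketch of both driver checks, assuming each ISR/DPC has been reduced to a (start_ms, duration_ms) pair; scanning windows anchored at each routine's start is a simple approximation of checking every possible 10 ms window.

```python
def isr_dpc_checks(routines, window_ms=10.0, single_ms=3.0, storm_ms=4.0):
    # routines: list of (start_ms, duration_ms) pairs, one per ISR/DPC.
    # Duration check: no single ISR/DPC may exceed 3 ms.
    if any(duration > single_ms for _, duration in routines):
        return False
    # Storm check: total ISR/DPC execution time overlapping any 10 ms
    # window must not exceed 4 ms (windows anchored at routine starts).
    for anchor, _ in routines:
        lo, hi = anchor, anchor + window_ms
        busy = sum(max(0.0, min(s + d, hi) - max(s, lo)) for s, d in routines)
        if busy > storm_ms:
            return False
    return True
```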
GPU VSync Cadence: This case ensures that the GPU VSync DPC cadence follows a well-behaved pattern. Fluctuations in GPU VSync DPC frequency during media playback can result in glitches. The test criterion is that the cadence fluctuation must not exceed +/- 50% of the average VSync cadence. For instance, on a 60 Hz monitor the expected VSync DPC cadence is 16.666 ms, so the test fails if any VSync DPC fires less than 8.3 ms or more than 24.9 ms after the previous one. A gap of more than 24.9 ms between two VSyncs often results in a perceivable video glitch; a gap of less than 8.3 ms is often caused by the driver double-firing VSyncs, or by VSyncs that are only a few microseconds (us) apart.
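The cadence rule reduces to a bounds check on the deltas between consecutive VSync DPC timestamps. A minimal sketch, assuming timestamps in milliseconds and a nominal refresh rate; the function name is hypothetical:

```python
def vsync_cadence_ok(vsync_ts_ms, refresh_hz=60.0):
    # Nominal cadence: 1000 / 60 Hz ~= 16.666 ms; each delta must stay
    # within +/- 50% of that, i.e. roughly [8.3, 25.0] ms at 60 Hz.
    period = 1000.0 / refresh_hz
    deltas = [b - a for a, b in zip(vsync_ts_ms, vsync_ts_ms[1:])]
    return all(0.5 * period <= d <= 1.5 * period for d in deltas)
```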
How to enable verbose ETW logging for analysis
To collect more verbose ETW logs, change the user-settable parameter 'DoFullLogging' to 'true' before running the tests.
How to preserve the ETW logs for analysis in case of failure
To preserve the ETW logs for failed test cases, change the user-settable parameter 'CopyLogsOnFailure' to 'true' before running the tests. This also copies the ETW logs of failed test cases to the controller and includes them in the HLK package to be shared for investigation.
Using Media Experience Analyzer to analyze failed ETW logs
You can use Media Experience Analyzer (MXA) to analyze failed ETW logs. The MXA tool is available as part of the Windows ADK.
Parameters
| Parameter name | Parameter description |
|---|---|
| TestCycles | Number of cycles to run the test |
| DoFullLogging | Set to 'true' to enable full logging of ETW traces in case of failure, then re-run the test |
| CopyLogsOnFailure | Set to 'true' to copy ETW log traces to the 'ETWlogs' subfolder in case of failure, then re-run the test. The failure logs are also copied into the HLKX package to be shared for investigation |
| FrameCount | Minimum number of Media Foundation (MF) events required during playback |