I am working on a WPF application that displays a grid of many images, with scrolling/expanding animations that play a video when the user hovers over an image.
When this application runs on an Intel CPU + NVIDIA GPU machine, the animations and video playback are very smooth at a steady 60 fps.
When it runs on a high-end AMD CPU + ATI GPU machine, the animations and video playback are noticeably laggy, and the frame rate drops to ~30 fps during animations/video playback. However, on these machines, if the user deactivates the application (e.g. by alt-tabbing to another application) and then scrolls/hovers over my program behind it to trigger the animations, no frames are dropped. Videos of a side-by-side comparison are linked at the bottom.
The Render Tier on both computers is 2, and enabling software-only rendering does not help address this.
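For reference, a minimal sketch of how the render tier can be read and software-only rendering forced in WPF (this is generic WPF API usage, not the poster's code; the tier sits in the high 16 bits of RenderCapability.Tier):

```csharp
using System.Diagnostics;
using System.Windows;
using System.Windows.Media;

public partial class App : Application
{
    protected override void OnStartup(StartupEventArgs e)
    {
        base.OnStartup(e);

        // RenderCapability.Tier packs the tier into the high 16 bits;
        // tier 2 means full hardware acceleration is available.
        int renderTier = RenderCapability.Tier >> 16;
        Debug.WriteLine($"Render tier: {renderTier}");

        // Force software-only rendering for the whole process, to rule
        // the GPU driver in or out as the cause. Fully qualified because
        // System.Windows.Media also defines a RenderOptions class.
        System.Windows.Interop.RenderOptions.ProcessRenderMode =
            System.Windows.Interop.RenderMode.SoftwareOnly;
    }
}
```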
When running a diagnostics session and performing the same animations in both the activated and deactivated states, the spike in dropped frames is clearly visible in the diagnostics graph (red = activated, green = deactivated). The UI thread does not appear to exceed 60%-70% at its peaks.
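A minimal sketch (assuming a standard WPF Window, not taken from the poster's project) of how per-frame timing can be logged alongside the window's activation state, to confirm the difference outside the profiler:

```csharp
using System;
using System.Diagnostics;
using System.Windows;
using System.Windows.Media;

public partial class MainWindow : Window
{
    private readonly Stopwatch _frameTimer = Stopwatch.StartNew();
    private long _lastTicks;

    public MainWindow()
    {
        InitializeComponent();

        // Fires once per rendered frame on the UI thread, so the gap
        // between invocations approximates the frame time.
        CompositionTarget.Rendering += OnRendering;
    }

    private void OnRendering(object sender, EventArgs e)
    {
        long now = _frameTimer.ElapsedTicks;
        double frameMs = (now - _lastTicks) * 1000.0 / Stopwatch.Frequency;
        _lastTicks = now;

        // Anything well above ~16.7 ms is a dropped frame at 60 fps.
        if (frameMs > 20)
            Debug.WriteLine($"Slow frame: {frameMs:F1} ms (active: {IsActive})");
    }
}
```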
Does anyone know why there would be a performance difference when my application becomes activated? The desired behaviour is for the animations/playback to always be as smooth as they are when the application is deactivated. The UI thread is not close to being maxed out, and as far as I know I have not written any tasks that run only while the application is activated.
In the end it seemed to have more to do with the application being fullscreen.
See how I fixed it
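The linked fix is not reproduced here, but as a hypothetical illustration of the fullscreen angle: a borderless window that exactly covers the screen can be handled differently by the compositor than an ordinary windowed one, and one common workaround is to keep the window from covering the screen exactly. A sketch of that idea, with the caveat that it may not match the actual fix behind the link:

```csharp
using System.Windows;

public partial class MainWindow : Window
{
    public MainWindow()
    {
        InitializeComponent();

        // Hypothetical workaround: leave the borderless window one pixel
        // short of full screen height so the compositor keeps it on the
        // normal composited presentation path instead of treating it as
        // a fullscreen application.
        WindowStyle = WindowStyle.None;
        ResizeMode = ResizeMode.NoResize;
        Left = 0;
        Top = 0;
        Width = SystemParameters.PrimaryScreenWidth;
        Height = SystemParameters.PrimaryScreenHeight - 1;
    }
}
```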