After buying the card I benchmarked it and noticed it was performing worse than my previous card. Under Linux, games and benchmarks were absolutely fine. Since then I've reinstalled Windows multiple times (both 11 and 10), messed around with ReBar, and tried many different drivers, among other things I can't remember. It's running with all 16 lanes at gen 3. It can't be a PSU issue, a CPU bottleneck, or thermal throttling, since none of those would explain why it works fine under Linux. It's performing worse than my Chinese Frankenstein 3070m card, which is roughly on par with a 1080 Ti. I 100% expected my i5-10400F to bottleneck the card, but the performance difference between Linux and Windows just doesn't make sense.
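For anyone who wants to sanity-check the same thing on their own box, here's a minimal Python sketch (Linux side, reading the standard PCI sysfs attributes) that prints the negotiated link speed and width. `card0` is an assumption; swap in the right `cardN` if you have more than one GPU.

```python
#!/usr/bin/env python3
"""Print the GPU's negotiated PCIe link speed/width under Linux."""
from pathlib import Path

# /sys/class/drm/cardN/device is a symlink into the PCI device directory,
# which exposes the current and maximum link speed/width as plain files.
dev = Path("/sys/class/drm/card0/device")  # assumption: GPU is card0

for attr in ("current_link_speed", "max_link_speed",
             "current_link_width", "max_link_width"):
    try:
        print(f"{attr}: {(dev / attr).read_text().strip()}")
    except FileNotFoundError:
        print(f"{attr}: not exposed by this driver")
```

On a gen 3 x16 link this should report 8.0 GT/s and a width of 16.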
Here are some benchmarks, all run on the same system, just with different GPUs:
RDR2, 3440x1440, XB1X settings: Linux - 112 avg | Windows - 62 avg
Furmark (1080p / 1440p): Linux - 256, 163 | Win10 - 256, 163 | Win11 - 267, 184
Cinebench GPU benchmark: 1286 (a 2070 Super scores around 6306 for reference). Unfortunately it would not run under Wine, so no Linux number.
Superposition 1080p Medium: 6800 XT - Win10: 14029, Win11: 13838, Linux (native): 22110 | 6600 XT: 17044 | 1080 Ti: 19115 | 3070m: 19870
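To put a rough number on the gap, using the Superposition scores above (trivial Python, nothing assumed beyond the scores themselves):

```python
# How far each Windows Superposition result falls short of the native Linux run.
linux = 22110
windows = {"Win10": 14029, "Win11": 13838}

for name, score in windows.items():
    deficit = (1 - score / linux) * 100
    print(f"{name}: {score} ({deficit:.0f}% below Linux)")
# Win10: 14029 (37% below Linux)
# Win11: 13838 (37% below Linux)
```

So the 6800 XT under Windows is losing roughly 37% of its Linux performance, and even trails the 6600 XT and 3070m on the same bench.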
Furmark seems to be the outlier in these benchmarks for some reason: it actually utilises the GPU, bringing the core clocks and power draw up. Other apps/games don't draw nearly as much wattage, and the core clock tops out around 1400 MHz. Windows itself also feels sluggish, with low-fps animations and laggy application windows (https://streamable.com/50ikq6). Every other game I've played has god awful performance compared to Linux: Holdfast gets 40-50 fps versus ~150 on Linux, Deep Rock 80-90 versus 130-160, and a new Minecraft world runs around 150-300 versus 700-1200. ETS2 runs at no more than 20 fps for some unknown reason. Cyberpunk refuses to run on both Win10 and 11, but runs fine on my other card and in Proton. All of these games have awful frame-times too.
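If anyone wants to compare clock/power behaviour against their own card, here's a minimal Python sketch that logs the active core clock state and board power on the Linux side via amdgpu's sysfs interface, to capture a known-good baseline while a game is running. `card0` and a single hwmon directory are assumptions about a typical single-GPU setup; adjust the paths if yours differ.

```python
#!/usr/bin/env python3
"""Log amdgpu core clock and board power once per second (Ctrl+C to stop)."""
import time
from pathlib import Path

dev = Path("/sys/class/drm/card0/device")  # assumption: GPU is card0
# pp_dpm_sclk lists the core clock states; the active one is marked with '*'.
sclk = dev / "pp_dpm_sclk"
# amdgpu's hwmon exposes average board power in microwatts.
power = next((dev / "hwmon").glob("hwmon*/power1_average"))

while True:
    cur = next((l for l in sclk.read_text().splitlines() if "*" in l), "?")
    watts = int(power.read_text()) / 1_000_000
    print(f"sclk: {cur.strip():<20} power: {watts:.1f} W")
    time.sleep(1)
```

Running this while gaming on Linux and eyeballing the same readings in HWiNFO or similar on Windows should make it obvious whether the card simply isn't clocking up or drawing power under Windows outside of Furmark.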