W-3175X + Asus Dominus Extreme + Strix RTX3090 OC HyperThreading GPU usage issue

xarot
Level 11
Hello,

I already opened a support ticket with NVIDIA, but maybe there are other owners of this Dominus Extreme board on this forum?

I have both an i9-7980XE (18c/36t) and a W-3175X (28c/56t). With the 7980XE everything is fine with the Asus Strix RTX 3090 OC. But with the W-3175X, GPU usage drops as low as 63% in 3DMark Time Spy, resulting in only a 13-14k overall score. The CPU test performance is as expected, as is Cinebench etc.

There is a workaround for the low GPU usage on the W-3175X: disabling Hyper-Threading to make it a 28c/28t CPU. GPU usage then pegs at 99-100% constantly and I get good scores, 19-20k. Another option is to disable at least 8 cores in the BIOS; performance is then not as good as with HT disabled, but with all cores and HT enabled, performance takes a huge nosedive.

I have tried all the Windows power plans, the options in the NVIDIA Control Panel, disabling G-Sync, etc. I am curious whether the same issue exists on AMD Threadripper CPUs as well, like the 3960X, 3970X or 3990X. The issue is also present on both 461.x and 457.x drivers.

Also, the score in the old Fire Strike is 27k with HT on vs. 33k with HT off. Cyberpunk also gets a nice boost in FPS with HT disabled.

HT on: https://www.3dmark.com/spy/17971572

Is this an NVIDIA or an ASUS issue? Unfortunately I don't have an AMD GPU to test with, but it would be nice to try one as well to rule out the board.
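For anyone wanting to test the HT theory per process without rebooting into the BIOS, here is a minimal sketch using psutil that pins a running process to one logical CPU per physical core. It assumes Windows enumerates HT siblings as adjacent pairs (0/1, 2/3, ...), and the process name is only an illustration:

```python
import psutil

# Hypothetical target; replace with the benchmark/game executable name.
TARGET = "Cyberpunk2077.exe"

# Assumption: with HT on, sibling logical CPUs are enumerated as adjacent
# pairs, so keeping only the even-numbered ones roughly leaves one logical
# CPU per physical core (28 of 56 on the W-3175X).
physical_only = list(range(0, psutil.cpu_count(logical=True), 2))

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == TARGET:
        proc.cpu_affinity(physical_only)
        print(f"Pinned PID {proc.pid} to {len(physical_only)} logical CPUs")
```

If GPU usage recovers with the affinity mask applied, that would point at thread scheduling rather than the board or the card.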
Main: i9-10980XE - Rampage VI Extreme Encore - 64 GB G.Skill Trident Z Royal 3600 CL16 - Strix RTX 3090 - Phanteks Enthoo Primo - Corsair AX1500i - Samsung 960 PRO 1 TB + Intel 600P 1TB - Water cooling
HTPC: i7-6950X - X99-M WS - 32 GB G.Skill RipjawsV DDR4-2400 - GTX1050TI - Bitfenix Pandora - Corsair AX860 - Intel 750 400 GB + Samsung 1 TB 850 EVO
All around: i9-7980XE - Rampage VI Extreme - 64 GB G.Skill 4000 CL18-19-19-39 - Strix RTX3090 - Phanteks P500A - Samsung 960 EVO 512 GB - Water cooling

G75rog
Level 10
Even my Apex with a 7900X and its 8 NVMe drives has better overall performance with HT disabled.

I studied the Dominus when it was released and discovered it's a monster Xeon crippled by the C621 workstation chipset and its workarounds. The DIMM.2 NVMe drives share bandwidth with PCIe x16 slots 2, 3 and 4. Boo. Had they done a proper chipset to keep the NVMe and PCIe lanes separate, I would have bought one.

Then you go and put a PCIe 4.0 3090 in a PCIe 3.0 slot and cripple it from the start.
Disable HT (which removes the HT overhead) and enjoy the monster.

G75rog wrote:
Even my Apex with a 7900X and its 8 NVMe drives has better overall performance with HT disabled.

I studied the Dominus when it was released and discovered it's a monster Xeon crippled by the C621 workstation chipset and its workarounds. The DIMM.2 NVMe drives share bandwidth with PCIe x16 slots 2, 3 and 4. Boo. Had they done a proper chipset to keep the NVMe and PCIe lanes separate, I would have bought one.

Then you go and put a PCIe 4.0 3090 in a PCIe 3.0 slot and cripple it from the start.
Disable HT (which removes the HT overhead) and enjoy the monster.


Thanks, but I am not using any DIMM.2 ports; currently it's only the Strix 3090 in the primary slot and an Asus Hyper X16 card in the third slot. The 3090 is running at PCIe 3.0, so the issue is unlikely to be there, and as I wrote, the 2080 Ti worked fine. That's why I suspect something fishy is going on with the new card or the drivers.
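If it helps to double-check what link the card actually negotiated, here is a minimal sketch using the pynvml bindings (an assumption on my part that the nvidia-ml-py package is installed; note the GPU can downshift its link generation at idle, so read it while under load):

```python
import pynvml

# Query the negotiated PCIe link of GPU 0 via NVML.
pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

name = pynvml.nvmlDeviceGetName(handle)
if isinstance(name, bytes):  # older bindings return bytes
    name = name.decode()

gen = pynvml.nvmlDeviceGetCurrPcieLinkGeneration(handle)
width = pynvml.nvmlDeviceGetCurrPcieLinkWidth(handle)
print(f"{name}: PCIe Gen{gen} x{width}")

pynvml.nvmlShutdown()
```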

xarot
Level 11
Oh, I actually found something similar on the AMD side. Maybe it really is the drivers, or maybe it's just a 3DMark thing. I'll have to do further testing with Cyberpunk...

https://www.overclockers.co.uk/forums/threads/why-are-some-3090s-only-hitting-14k-gpu-in-timespy-ben...

There are benchmarks/programs that do not work well beyond a certain number of threads; it is possible that is the case here. Also, like the other user with the 7900X mentioned, Hyper-Threading on the Skylake-X chips does seem to hurt performance more than on previous generations (like the 6950X/5960X). I have it disabled on my 7980XE as performance is much worse with it enabled.
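One quick sanity check on that theory is to compare the physical vs. logical CPU counts the benchmark sees with how many threads it actually spawns. A minimal psutil sketch (the process name is hypothetical; substitute the real executable):

```python
import psutil

# How many cores the OS exposes with and without HT/SMT.
phys = psutil.cpu_count(logical=False)
logi = psutil.cpu_count(logical=True)
print(f"Physical cores: {phys}, logical CPUs: {logi}")

# Hypothetical benchmark executable; shows how many threads it spawned.
for proc in psutil.process_iter(["name", "num_threads"]):
    if proc.info["name"] == "3DMarkTimeSpy.exe":
        print(f"{proc.info['name']} is running {proc.info['num_threads']} threads")
```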