06-23-2024 11:23 AM
I just purchased the motherboard in the subject line along with some other items. Here is the list:
---
I purchased the motherboard and installed the CPU, SSD, and RAM without any problems.
Then I updated BIOS to the latest version (2613) found on the support website.
Then I installed Windows 11 (with Armoury Crate and the related drivers) without any problems.
In summary, the basic setup and installation work well, although the "memory check" (orange Q-LED) sometimes takes a very long time before booting into Windows.
Finally, my aim is to install and use multiple (up to six) GPUs, as I have computational tasks to run that do not require much communication with the CPU, so bandwidth is not a problem for me. [This is similar to most miners, although I am not mining anything.]
This is where I start having issues: I am not able to connect more than one GPU.
I have tried the following GPUs one by one in the first PCIEX16_1 slot, and each is recognized and works well when it is the only card installed:
1. DUAL-RTX4070S-O12G
2. TUF-RX7900XT-O20G-GAMING
3. AMD Radeon Pro VII
Then I tried to connect a second card in the PCIEX16_2 slot (all possible combinations of those cards), and the PC would not start, getting stuck at the VGA Q-LED most of the time. Sometimes it does start, but Windows recognizes only the first GPU. The other GPU is connected and its fans are spinning, but it is simply not visible in Device Manager.
This behaviour is not consistent: when it is stuck on the VGA Q-LED, I might switch the power off and on again, and the computer might either get stuck again or start (but the second GPU is still not recognized or visible).
Similarly, I tried to use a PCIe riser to connect a GPU to either the PCIEX1_1 or PCIEX1_2 slot, but without any success.
[Although connecting a GPU to the PCIEX16_1 slot using a riser does work; still only one GPU, as in the first tests.]
I have tried changing the PCIe version to 2 and to 3 in the BIOS, and sometimes it helps (both cards become visible and usable in Windows), but after a reboot or two the second card is again not recognized [it feels like the BIOS just forgets the last good state]. So I conclude that the basic system works, the PCIe risers work, and all GPUs work (one by one), but connecting more than one simply fails.
I did not see anywhere in the documentation that the motherboard does not support multiple GPUs. Is there anything I am missing? The fact that I sometimes get different results on boot suggests to me that the issue might be related to the motherboard. What else could I do or test to know for sure?
Thank you!
---
I tried to submit a service request, but it is not being submitted (I see some JavaScript errors in the console). Therefore, I hope to get some support on this forum.
06-27-2024 01:38 AM
Hello, @vanoso
Since you haven't mentioned it, we recommend removing all M.2 SSDs or Hyper M.2 cards while testing the motherboard and GPU connections.
Could you please confirm the following:
- When a single GPU is installed in PCIEX16_2, is it correctly recognized, and does it output a video signal?
- When installing any two GPUs, have you ensured in the BIOS that the "PCIEX16_1 Bandwidth Bifurcation Configuration" is set to [Auto Mode] and the "PCIEX16_2 Bandwidth Bifurcation Configuration" is set to [PCIE X8 Mode]?
Additionally, regarding your other post: URGENT: cannot submit a support request
I have updated our PM conversation, kindly check and confirm at your convenience.
Thank you.
06-27-2024 04:28 AM
Thank you for the guidance.
I can confirm that the "PCIEX16_1 Bandwidth Bifurcation Configuration" is set to [Auto Mode] (it behaves the same in x8/x8 mode).
In my BIOS (v2613) there is no bifurcation setting for PCIEX16_2. I can only configure the speed of PCIEX16_2, which shares bandwidth with M.2_3.
I am not able to test without the SSD at the moment, as it is the only disk I have, and it is installed in the M.2_1 slot.
Having done some more tests, I can say that I can actually use two GPUs at the same time. But my problem is that they are just too bulky, which is why I connected one of them using a riser such as this one (https://www.amazon.com/ELUTENG-Adapter-Capacitors-Powered-Extender/dp/B09HHGGRLP/).
Why is a GPU recognized through such a riser only in the PCIEX16_1 slot?
07-01-2024 01:46 AM
Hello, @vanoso
Regarding installing GPUs, may I ask whether two GPUs installed directly in the PCIe slots, without adapters, can be correctly recognized?
We do not recommend using extension or adapter devices during installation, as this may introduce additional potential hardware compatibility issues.
Connecting too many GPUs at the same time (via extension connections) may strain the available bandwidth, causing some GPUs to throttle, perform poorly, or exhibit other abnormalities.
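The bandwidth concern above can be made concrete with some rough arithmetic. The sketch below (illustrative only; it uses nominal per-lane link rates minus encoding overhead and ignores protocol overhead) compares what a card gets in a PCIEX1 slot versus the full PCIEX16_1 slot:

```python
# Approximate per-direction PCIe throughput by generation and lane count.
# Nominal line rate (GT/s per lane) and encoding efficiency per generation:
# Gen1/2 use 8b/10b encoding, Gen3+ use 128b/130b.
GEN = {
    1: (2.5, 8 / 10),
    2: (5.0, 8 / 10),
    3: (8.0, 128 / 130),
    4: (16.0, 128 / 130),
    5: (32.0, 128 / 130),
}

def bandwidth_gbps(gen: int, lanes: int) -> float:
    """Usable one-direction bandwidth in GB/s (nominal, pre-protocol-overhead)."""
    rate, eff = GEN[gen]
    return rate * eff * lanes / 8  # GT/s -> GB/s per lane, times lane count

# A GPU on a riser in a PCIEX1 slot vs. the main PCIEX16_1 slot:
print(f"Gen4 x1 : {bandwidth_gbps(4, 1):.2f} GB/s")   # roughly 1.97 GB/s
print(f"Gen4 x16: {bandwidth_gbps(4, 16):.2f} GB/s")  # roughly 31.51 GB/s
```

For compute workloads that rarely talk to the CPU (as described earlier in the thread), even an x1 link is often tolerable, but it explains why heavily extended or shared-lane configurations behave so differently from the primary slot.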
Thank you.
07-01-2024 08:19 PM
Hi @vanoso, as you correctly read in the manual, PCIEX16_2 shares bandwidth with the M.2_3 slot. This also means that this PCIe slot is connected to your B650 chipset, not to your CPU. The only PCIe slot connected to the CPU on your board is PCIEX16_1. This information is shown in the EXPANSION SLOT picture @Jiaszzz_ROG shared above.
I am not sure whether you will be able to run a graphics card in a chipset-connected slot, as support will be limited or very tricky. This is why you are having issues. As @Jiaszzz_ROG mentioned here, you should put just one graphics card in the PCIEX16_2 slot and see if it gets detected and works, because if it does not, then you already know you can only use one graphics card. Even if this test passes, getting two cards to work where one runs via the CPU and the other via the chipset is going to be extremely difficult.
My recommendation is to look for a board that provides dual graphics card support, in other words two PCIe slots wired to the CPU. Looking at the https://www.guru3d.com/review/amd-ryzen-9-7950x3d-processor-review/page-3/ page, you can see that a B650 chipset board can provide two PCIe slots from the CPU. It all depends on which board supports this architecture.