ASUS ROG STRIX Z690-I GAMING WIFI - secondary NVME (M2_2) issue

PeDeGe
Level 8
I recently purchased an Alder Lake system (ASUS Z690-I motherboard), but I'm having problems getting the secondary NVMe SSD (Samsung 980 PRO 2TB) to work in Windows 11. The drive is detected and visible in the BIOS, but does not show up in Windows (or only for a very limited time).

The weird thing is: when installing the Intel RST driver, at around 22% into the process, the secondary NVMe SSD is detected by Windows 11 (a notification appears) and is usable for a little while. During that time I can copy files to it and the drive operates normally. When the RST driver installation finishes, even before a reboot, the drive is ejected and becomes unavailable in Disk Management again. I can reproduce this behaviour.

Hardware
--------
Motherboard: ASUS ROG STRIX Z690-I GAMING WIFI
CPU: 12th Gen Intel(R) Core(TM) i5-12600K
RAM: 2x16GB Crucial DDR5
NVME SSD #1: Samsung SSD 980 PRO 2TB -> connected to CPU (on M2_1), boot drive
NVME SSD #2: Samsung SSD 980 PRO 2TB -> connected to Chipset (on M2_2) -> problem M2 slot
SATA SSD: Samsung SSD 870 QVO 4TB -> not relevant here, but added for completeness

Software
--------
Windows 11 PRO -> fully updated
Intel_RST_Driver_Software_V19.0.0.1067_WIN10_WIN11_64-bit -> downloaded from ASUS (Z690-I) Product Support page

Some observations
-----------------
- Both NVMe SSDs are working correctly. I've switched the drives multiple times and installed Windows on both of them. I've also run self-tests and used Samsung Magician to check their health and performance.

- One time, after switching the drives between the motherboard M2 slots, the secondary drive (in M2_2) was actually active for a short while. During that time I managed to update the firmware of both drives, but shortly afterwards the secondary drive was ejected again.

- During the Intel RST driver installation, the Windows Event Viewer generates the following warning: "The application \Device\HarddiskVolume3\Windows\System32\DriverStore\FileRepository\iastorvd.inf_amd64_815480839574a92b\RstMwService.exe with process id 3560 stopped the removal or ejection for the device PCI\VEN_8086&DEV_467F&SUBSYS_86941043&REV_00\3&11583659&0&70.
Process command line: C:\Windows\System32\DriverStore\FileRepository\iastorvd.inf_amd64_815480839574a92b\RstMwService.exe" -> this seems important, but I'm not able to determine what exactly it means.

- In Windows Event Viewer another error appears after a system reboot/restart: "The driver detected a controller error on \Device\RaidPort2.". I noticed other people in this forum experiencing the same problem, but on a different motherboard, with no resolution there yet. In that thread they concluded it might have to do with SSD power management and suggested disabling "PCI Express Native Power Management" in the BIOS, but that didn't work in my case. More information here: https://rog.asus.com/forum/showthread.php?126865-Hero-Z690-having-issues-with-getting-windows-errors... (A small script for pulling both of these events out of the System log is sketched after this list.)

- I also tried Windows 10, but the same issue occurs there.
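
In case it helps anyone correlate these events with boot times, here is a rough Python sketch (my own illustration, not from the OP) that shells out to wevtutil and filters the System log for the two messages quoted above. The message substrings are simply the phrases from this thread; adjust the count as needed.

import subprocess

# Phrases quoted earlier in this thread; both land in the System log.
SUBSTRINGS = [
    "stopped the removal or ejection",   # Kernel-PnP warning about RstMwService.exe
    "detected a controller error",       # \Device\RaidPortN error after a reboot
]

def recent_system_events(count=500):
    """Return the last `count` System-log events as one text blob."""
    result = subprocess.run(
        ["wevtutil", "qe", "System", "/f:text", "/rd:true", f"/c:{count}"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

# wevtutil's text output starts each record with "Event[N]:", so split on that.
for block in recent_system_events().split("Event["):
    if any(s.lower() in block.lower() for s in SUBSTRINGS):
        print("Event[" + block.strip())
        print("-" * 60)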

Could you please assist? I'm very confused by this behaviour. To me, it now looks like either a hardware malfunction (defective M2_2 slot), a BIOS issue (NVMe power management not working correctly), or a chipset/RST driver issue.

Thank you.

asayler wrote:
There seems to be a second, but unrelated, issue with the M2 slots on a number of Z690 chipset boards related to the PCIe power management settings: https://rog.asus.com/forum/showthread.php?126865-Hero-Z690-having-issues-with-getting-windows-errors.... You might want to try disabling the "PCI Express Native Power Management" option in the BIOS as suggested in that thread. (Note: SR-IOV also needs to be disabled in the BIOS for this to have an impact, although I believe it is disabled by default.) Hopefully ASUS can fix that in future BIOS firmware updates.


Thanks mate, I will give this a go and report back in a few days after some testing. Thanks again.

Looks like it did not work. I started up this morning and the drive was not seen; after a restart it is now there. Something is really wrong with these motherboards.

Auspcbuilda wrote:
Looks like it did not work. I started up this morning and the drive was not seen; after a restart it is now there. Something is really wrong with these motherboards.


Stupid question maybe, but have you disabled Fast Startup? (Windows Control Panel > Power Options > Choose what the power buttons do > Change settings that are currently unavailable > uncheck Fast Startup.) Your restart scenario suggests it might be turned on.
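
For anyone who wants to verify that setting without clicking through the Control Panel, here is a small Python sketch (my own illustration, not an official tool) that reads the standard HiberbootEnabled registry value backing the Fast Startup checkbox.

import winreg

KEY = r"SYSTEM\CurrentControlSet\Control\Session Manager\Power"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY) as key:
    value, _ = winreg.QueryValueEx(key, "HiberbootEnabled")

print("Fast Startup is", "ENABLED" if value else "disabled")

# To turn it off without the Control Panel route, either set HiberbootEnabled
# to 0 (needs admin rights) or disable hibernation entirely with: powercfg /h off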

On disappearing NVMe drives: I would suggest verifying the drive firmware is the latest and then looking at the possibility of some kind of power supply issue. One of our fellow members tracked his problems down and they were apparently power related. The other thing I have seen is a board that had problems with slot 2 where the drive didn't seat well in the slot, so it may be worth just reseating it.

have you disabled fast startup? -- Yes I have, it's the same with it on or off
verifying drive firmware is latest -- All firmware is up to date
power related apparently -- 3 different PSUs, pretty sure that's not it
drive didn't seat well. So just reseating it -- this has been done many times as well.

I think they made a really bad design with this MB. Looks like this one will be getting returned, and I will just get an MSI board instead. This board has been nothing but problems. Really upsetting though, as this is the first time I went all ASUS with my build because they are supposed to be one of the best, but this has been the worst motherboard I have ever had (both of them, as I have already RMA'd one).

Auspcbuilda wrote:
have you disabled fast startup? -- Yes I have, it's the same with it on or off
verifying drive firmware is latest -- All firmware is up to date
power related apparently -- 3 different PSUs, pretty sure that's not it
drive didn't seat well. So just reseating it -- this has been done many times as well.

I think they made a really bad design with this MB. Looks like this one will be getting returned, and I will just get an MSI board instead. This board has been nothing but problems. Really upsetting though, as this is the first time I went all ASUS with my build because they are supposed to be one of the best, but this has been the worst motherboard I have ever had (both of them, as I have already RMA'd one).


From there I would do the following as a last effort: load BIOS defaults and do a good cold reset, whereby you pull the power plug and press the power button to drain residual power. Start up, and if the drive is working, go into Windows and reinstall the driver for the drive controller by simply deleting it from Device Manager and restarting so it reinstalls. One other comment on the Intel drive controller: some software for it is available from the Microsoft Store, so check the Microsoft Store for updates.
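
If you'd rather do the controller step from the command line instead of Device Manager, something like the following Python sketch (my own rough illustration; needs an elevated prompt and Windows 10 2004 or newer for these pnputil verbs) lists the Intel controllers and then restarts/rescans the chosen one. The VEN_8086 filter is just the vendor ID quoted in the OP's event log entry.

import subprocess

def run(args):
    return subprocess.run(args, capture_output=True, text=True).stdout

# 1. List connected devices and pull out the Intel (VEN_8086) instance IDs,
#    so the RST/VMD storage controller can be identified.
devices = run(["pnputil", "/enum-devices", "/connected"])
intel_ids = [
    line.split(":", 1)[1].strip()
    for line in devices.splitlines()
    if "Instance ID:" in line and "VEN_8086" in line
]
print("\n".join(intel_ids))

# 2. Restart the controller picked from the list above, then rescan so the
#    drive behind it is re-detected (uncomment and fill in the instance ID).
# run(["pnputil", "/restart-device", "<instance id from step 1>"])
# run(["pnputil", "/scan-devices"])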

I have tried all of the above and it is still not working. I have decided to just return this MB now; I have had enough of it to be honest. That's the last time I buy ASUS, it's not worth the extra price.

Auspcbuilda wrote:
I have tried all of the above and it is still not working. I have decided to just return this MB now; I have had enough of it to be honest. That's the last time I buy ASUS, it's not worth the extra price.


Understandable. Sometimes you are just beating a dead horse and it's hard to know where to draw the line.

This is probably unrelated, but I just discovered something, a month or so after the fact, that once remedied resulted in better NVMe SSD performance. A while back I used MoKiChU's tool to test the security vulnerability of the MEI (Management Engine Interface). At the time the MEI firmware was the original and the driver was from the support page. I failed the test and wanted to try updating it, so I followed MoKiChU's instructions to update the firmware and driver. It worked and the security issue was fixed. I use all his driver packages and they have been great.

I also use the Driver Store Explorer tool from his threads. It makes it easy to keep your driver store uncluttered by removing the old drivers it identifies. To get to the end of the story: it did not remove an old MEI driver that it should have, or maybe something didn't install right the last time I installed. I found that there were two versions of heci.inf in the store, and I think the older one was still being used but was kind of broken. When I deleted the older one with Driver Store Explorer, I saw a definite improvement in my SSD speed. I would never have suspected the MEI had a problem; there were no other indications to go on. Good luck with your next project.

Edit: I just learned that Driver Store Explorer is operating like it should when it does not mark system drivers for deletion as a safety precaution.
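
For reference, duplicate driver-store entries like the two heci.inf copies described above can also be spotted without a third-party tool. Here is a small Python sketch (just an illustration) that parses the output of pnputil /enum-drivers and reports any packages sharing the same original INF name; review them manually before removing anything with pnputil /delete-driver oemNN.inf.

import subprocess
from collections import defaultdict

output = subprocess.run(["pnputil", "/enum-drivers"],
                        capture_output=True, text=True).stdout

# Group published names (oemNN.inf) by the original INF they came from.
published, by_original = None, defaultdict(list)
for line in output.splitlines():
    if "Published Name:" in line:
        published = line.split(":", 1)[1].strip()
    elif "Original Name:" in line and published:
        by_original[line.split(":", 1)[1].strip().lower()].append(published)

# Print only the INFs that appear more than once in the driver store.
for original, copies in sorted(by_original.items()):
    if len(copies) > 1:
        print(f"{original}: {', '.join(copies)}")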

mark0409mr01
Level 7
I'm also having a very similar problem with this board. I was trying to create a RAID 0 array with 2x SN850s, but just ended up going round in circles due to the problems with M.2_2. I have reseated the M.2 stack PCB numerous times and am confident everything is seated correctly. To me this appears to be a BIOS/firmware level issue, so it can hopefully be fixed with an update.

My thread is here: https://rog.asus.com/forum/showthread.php?128022-ROG-STRIX-Z690-I-GAMING-WIFI-NVME-PCIE-RAID-2x-SN85...

I have a potentially related issue with the M2_2 slot on my Z690-I as well, but the symptoms are slightly different.

OS: Windows 10 21H2 (build 19044.1526)
BIOS version: 1003

M2_1 drive: Samsung 970 EVO
M2_2 drive: Samsung 960 EVO

Starting Power State                               M2_2 detected by OS
AC Power off                                       yes
Shutdown                                           yes
Sleep                                              yes
Reboot from OS                                     NO
Reboot from UEFI (e.g. after settings change)      NO


At the same time, I am also getting hundreds of WHEA-Logger Event 17 errors in Windows for the PCI Express Root Port.
I have tried re-seating the drive, resetting BIOS defaults, setting the PCIe mode to Gen 3, and disabling ASPM as suggested in other threads, but none of it changes the outcome.
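
To confirm the boot-type pattern in the table above over more cycles, here is a rough Python sketch (my own illustration; the model string and log path are placeholders) that can be scheduled to run at logon with Task Scheduler and appends one line per boot recording whether the M2_2 drive is visible to Windows.

import datetime
import subprocess

DRIVE_MODEL = "Samsung SSD 960 EVO"          # model of the drive in M2_2
LOG_FILE = r"C:\Temp\m2_2_presence.log"      # placeholder path

# Ask Windows for the friendly names of all physical disks it can see.
disks = subprocess.run(
    ["powershell", "-NoProfile", "-Command",
     "Get-PhysicalDisk | Select-Object -ExpandProperty FriendlyName"],
    capture_output=True, text=True,
).stdout

present = DRIVE_MODEL.lower() in disks.lower()
with open(LOG_FILE, "a") as log:
    log.write(f"{datetime.datetime.now().isoformat()}  "
              f"{'DETECTED' if present else 'MISSING'}\n")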

Hi guys, after a month of running my RAID 0 with two PCIe 4.0 M.2 WD Black SN850s, this morning I turned on the PC and lost the M2_2 slot and the RAID 0 array. Excellent... Actually, while I was installing the two M.2 drives in my Z690-I, I already suspected that the adapter with the flat cable was not a well-designed piece. I'm still trying to understand whether the problem is mechanical, logical, or related to PCIe speed management. But why did the problem only appear after more than a month?
I had reliable, perfect functioning for a month or more.
I did some cross tests, removing and swapping my M.2 drives, and I could not figure out the logic of what the BIOS is doing. I have to say that the slot reappears and then disappears every time I insert the second drive into M2_2; after a reset or a BIOS restore, every now and then both M.2 drives are seen.
Did you finally figure out what it is?
After some tests, an update on the situation: I changed the PCI Express link speed from Auto to Gen 3, and with that I have seen my OS installation working again.
So the problem is in the PCIe speed management, I think because of the cables and the construction of that ugly tower/sandwich of M.2 adapters. We are at the limit of what the motherboard's layout can handle; I don't believe the adapters and connection cables are built well enough for such a high PCIe data rate.
My build is a bit crazy and pushes the speed too far: I have a RAID 0 of two WD Black SN850s, but I think that is too much, and RAID is difficult here. Remember that one M.2 slot's PCIe lanes come from the CPU and the other's from the chipset... and I also have an i9-12900K CPU and an RTX 3080 GPU...

I'll keep working on it...

My idea is that the M2_2 connection is badly designed: PCIe Gen 4 mode is not guaranteed by the connections, so dropping down to Gen 3 makes things better. In fact, that is what I am trying now.