03-19-2024 04:03 PM
My and others' PG32UCDM monitors have a clipping issue on PC when using the Console HDR mode in Windows 11. Currently, the monitor clips at 450 nits in the Maximum Luminance window, but it clips correctly in the Max Full Frame Luminance window. Is there a firmware update in the works for this, or is this an RMA issue?
Solved! Go to Solution.
03-25-2024 09:49 AM
That might just mean that FF relies on the EDID reading of peak brightness, which is what @Rogex47 explained.
I personally don't have FF, so I have no clue.
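For readers following the EDID angle: a display advertises its peak brightness in the CTA-861 HDR Static Metadata block of the EDID, with luminance stored as 8-bit code values. A minimal decoding sketch, assuming the CTA-861-G formulas (the example code value is illustrative, not from an actual PG32UCDM dump):

```python
# Sketch: decode the luminance fields of a CTA-861 HDR Static Metadata
# data block, the part of the EDID a monitor uses to advertise peak
# brightness. The code values below are illustrative, not a real dump.

def decode_max_luminance(cv: int) -> float:
    """CTA-861-G: desired content max luminance = 50 * 2^(CV/32) cd/m2."""
    return 50.0 * 2 ** (cv / 32.0)

def decode_min_luminance(cv_min: int, max_lum: float) -> float:
    """Desired content min luminance = max_lum * (CV/255)^2 / 100 cd/m2."""
    return max_lum * (cv_min / 255.0) ** 2 / 100.0

# Example: a code value of 96 advertises 50 * 2^3 = 400 cd/m2,
# the kind of value a TrueBlack-400 mode would be expected to report.
print(decode_max_luminance(96))  # 400.0
```

So if the EDID were the culprit, the advertised code value would have to decode to something far below the panel's real capability, which is easy to check with a raw EDID dump.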
03-25-2024 10:29 AM
@Silverhaze_NL yes, I do have The Last of Us. Will download and try it out.
@PiPoLiNiO If you tune the brightness for 1,000 nits and then switch to TB400, the image will be brighter because of overexposure; the clouds, for example, lose all detail and become pure white. However, you are also not wrong: in some scenes (during cutscenes) switching to TB400 did increase the brightness without introducing clipping or overexposure.
@DonDOOM Unfortunately I was wrong about the EDID; it is not the issue here. I downloaded the "winter fox hdr" video, and the brightness in MediaPlayerClassic is the same as when watching it online: very dim. So I did some testing.
1. I have measured the luminance in "Console HDR" mode. You can see the value on the left side (second monitor) under "Current": 56.87 cd/m2.
Even the brightest spot in this scene on the far right was only around 82nits.
2. Measuring the same middle spot again, but in TB400 mode, led to 109.59 cd/m2:
3. Now I opened a second instance of MPC and played a 100% white scene in HDR almost at full screen.
If I activated fullscreen mode the whole image was black, so I stuck to borderless, but it is quite close to 100% fullscreen.
Measured luminance was 257.18 cd/m2 in "Console HDR" and 263.65 in "TB400":
4. Trying MPC almost fullscreen without UI but with a small black border:
5. Last but not least I have disabled HDR and measured the "Winter Fox HDR" clip luminance in SDR.
117.26 nits in SDR, which is pretty much 2x the luminance of HDR in "Console HDR" mode.
If MSI handles HDR differently, @PiPoLiNiO's claim about it being twice as bright makes sense.
I hope this test makes sense, but please feel free to correct me if I am wrong 🙂
Whether this is a bug or intentional I don't know, but like others here I do not find it acceptable for HDR brightness to be half of SDR, especially when competitor models seem to perform better in this regard. I really hope ASUS officials look into this matter and give us official feedback.
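As a side note for anyone repeating these measurements: an HDR10 signal encodes absolute luminance via the SMPTE ST 2084 (PQ) transfer function, so meter readings can be sanity-checked against what the source actually encodes. A minimal sketch of the PQ EOTF:

```python
# Sketch: the SMPTE ST 2084 (PQ) EOTF, mapping a normalized HDR10 signal
# value in [0, 1] to absolute luminance in cd/m2. Useful for checking a
# meter reading against what the source signal encodes.

M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(signal: float) -> float:
    """Convert a normalized PQ signal value (0..1) to luminance in cd/m2."""
    if signal <= 0:
        return 0.0
    ep = signal ** (1 / M2)
    num = max(ep - C1, 0.0)
    den = C2 - C3 * ep
    return 10000.0 * (num / den) ** (1 / M1)

# Signal 1.0 (10-bit code 1023) is 10,000 nits by definition; a 10-bit
# code of 520 lands at roughly 100 nits, the usual SDR reference white.
print(round(pq_to_nits(1.0)))         # 10000
print(round(pq_to_nits(520 / 1023)))  # 100
```

If a scene's encoded values map to, say, 100+ nits but the meter reads ~57, the display is tone-mapping or limiting, not the content.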
03-25-2024 11:52 AM
I applaud the effort @Rogex47 ! Great work. Perfect showcase of the issue we're talking about.
I estimated it was 100 nits in the 'winter fox hdr' video, but seeing it measured at 50-80 nits just makes me wonder even more how this got through testing, both on ASUS' side and the reviewers'.
Please contact customer support and send them this write-up. I'll do the same, using your post as a reference. I already did so last week, trying to formulate the issue as best I could, but this makes it perfectly clear. If I had the tools, I would've done it myself.
Again, hats off to you for the clear and concise explanation!
03-25-2024 11:23 AM
@Rogex47 Well done, buddy. And that's what it was all about 🙂 At MSI, everything was fine. HDR brightness in bright scenes was the same in 1000 and True Black modes.
03-25-2024 11:41 AM
I'm wondering something else. Do you really want to keep the defective monitor instead of returning it to the store? How can you allow a corporation to push you around like this?
Today I told the store I want to return it, and now I have 14 days to do so. If ASUS does not release the firmware and continues to ignore customers, I will send it back. Every self-respecting person should do the same, because if you accept such a thing, in 2-3 years you will receive a model at launch where HDR does not even turn on because someone forgot to check.
03-25-2024 12:01 PM
I don't.
If there is no fix soon I will definitely return it now that I have actual luminance numbers to prove that this monitor is not working correctly.
03-25-2024 01:57 PM - edited 03-25-2024 02:00 PM
This isn't a knock on your testing, but what you've shown is how ABL works in a high-APL scene. Could it be not working as intended? Possibly, but it could also be tuned very aggressively. Here are 3 different tests you can use to help diagnose ABL (it's a WeTransfer link). The ABL gradient test will show you how aggressively it kicks in, and yes, it does exist to some extent in all HDR modes. Additionally, if you measure the white dress in the tonemapping test, you'll see that it reaches similar peak brightness levels of around 600 nits across all 1000-nit HDR modes when MaxCLL is set to the standard 4000 nits, compared to TB400, which can only hit 450 nits.
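To illustrate the ABL point: a toy model of how a power limiter ties achievable peak brightness to the Average Picture Level. The roll-off shape and all numbers are made-up assumptions for illustration, not the PG32UCDM's actual firmware logic:

```python
# Toy model (NOT the monitor's real algorithm): ABL caps total panel
# power, so achievable peak brightness falls as the Average Picture
# Level (APL) rises. All numbers here are illustrative assumptions.

def apl(frame):
    """Average Picture Level of a frame given as nested lists of 0..1 values."""
    flat = [v for row in frame for v in row]
    return sum(flat) / len(flat)

def abl_peak_nits(apl_fraction, peak=1000.0, full_frame=250.0, window=0.02):
    """Hypothetical limiter: full peak up to a 2% window, then a linear
    roll-off down to the sustained full-frame level at 100% APL."""
    if apl_fraction <= window:
        return peak
    t = (apl_fraction - window) / (1.0 - window)
    return peak + (full_frame - peak) * t

white_frame = [[1.0, 1.0], [1.0, 1.0]]   # fullscreen white
print(abl_peak_nits(apl(white_frame)))   # 250.0, the full-frame limit
print(abl_peak_nits(0.02))               # 1000.0, the small-window peak
```

The point being: a near-fullscreen white measurement around 250 nits is exactly what an aggressive limiter predicts, regardless of what a 2% window can hit.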
Regarding some of your methodology: did you verify that the video you ripped for use in MPC retained its HDR metadata? Also, what video card do you use? MUB reported HDR brightness discrepancies between Nvidia and AMD. I'm not questioning that the scene you tested was darker (I've seen the same thing myself), just that there are many considerations one should address when measuring/testing. Lastly, not that you used it for this particular test, but it's worth mentioning that the built-in pattern generator in DisplayCAL may not be bit-perfect depending on one's config. Since you have access to a meter and DisplayCAL, you can use the dogegen pattern generator, which supports Calman, ColourSpace, HCFR, and DisplayCAL in SDR and HDR. On Nvidia, make sure to use the Override to Reference Mode to prevent the GPU from interfering with the patterns.
I was on set most of today, so I haven't had time to look into this some more. I do have more testing I'd like to conduct at some point this week.
03-25-2024 04:15 PM
@Kundica all good. Please keep in mind that to me it was less about testing the monitor and more about providing numbers for the "HDR is too dim" topic. All tests, patterns, ABL etc. aside, I think we can all agree that a monitor should NEVER be brighter in SDR than in HDR while showing HDR content. According to other users, MSI and AW are doing a better job here, and to me this would be a reason to return the ASUS and get an MSI model. Don't get me wrong, I don't want to, but I can't accept paying 1,499€ for a product which is inferior to a cheaper counterpart with the same panel.
To reply to your questions:
- Yes, the ripped video has HDR metadata. Windows HDR mode even kicks in automatically if I try to play it without being in HDR mode. The HDR format is SMPTE ST 2086, HDR10 compatible, and the maximum content light level is 1,000 nits according to MediaInfo
- I am using an RTX 4090
- The fullscreen 100% white was a pattern I downloaded, but I have also used madVR/madTPG as a pattern generator. I don't see a problem here; fullscreen white is at around 250 nits, as it should be.
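For anyone wanting to cross-check what MediaInfo reports: MaxCLL and MaxFALL are defined in CTA-861 over the decoded pixels. A minimal sketch with made-up toy frames (linear RGB values already expressed in cd/m2, purely for illustration):

```python
# Sketch: how MaxCLL and MaxFALL, the HDR10 content light-level metadata
# that MediaInfo reports, are defined per CTA-861. Frames here are tiny
# toy lists of linear (R, G, B) pixel values already in cd/m2.

def max_cll(frames):
    """Maximum Content Light Level: brightest max(R, G, B) in any frame."""
    return max(max(max(px) for px in frame) for frame in frames)

def max_fall(frames):
    """Maximum Frame-Average Light Level: highest per-frame average of
    the per-pixel max(R, G, B)."""
    return max(sum(max(px) for px in frame) / len(frame) for frame in frames)

frames = [
    [(100, 100, 100), (1000, 800, 600)],  # frame with one bright highlight
    [(50, 50, 50), (50, 50, 50)],         # dim frame
]
print(max_cll(frames))   # 1000
print(max_fall(frames))  # 550.0
```

A MaxCLL of 1,000 therefore only says the brightest highlight anywhere in the clip reaches 1,000 nits; it says nothing about how dim the average scene is allowed to be.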
03-25-2024 06:54 PM - edited 03-25-2024 06:55 PM
Actually, it's not so much about "should". SDR being brighter than HDR in high-APL scenarios isn't unique to this ASUS monitor; it's typical behavior on QD-OLED monitors. deflet_ on Reddit replied to someone posting your results, referencing this behavior, and even provided a measurement showing it happening on the AW34. Read through the whole exchange on that post if you can. The same user has some fairly interesting posts in other threads if you read through them.
03-25-2024 07:57 PM
I do wonder: apart from an incorrect EDID, would there be a reason why the PG32UCDM is so much dimmer than other models with the same panel (AW, MSI, and even last-gen QD-OLED), some of which are also cheaper?