03-19-2024 04:03 PM
My PG32UCDM, like others' monitors, has a clipping issue on PC when using the Console HDR mode in Windows 11. Currently, the monitor clips at 450 nits in the Maximum Luminance test window, but it clips at the correct level in the Max Full Frame Luminance window. Is there a firmware update in the works for this, or is this an RMA issue?
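For anyone who wants to check what peak-luminance values the monitor is actually advertising to Windows (which seems to be what HDR-aware apps clip against), here is a rough sketch using DXGI's IDXGIOutput6::GetDesc1 from the Windows SDK. The field names come from DXGI_OUTPUT_DESC1; the rest is just illustrative plumbing, nothing specific to this monitor:

```cpp
// Rough sketch: print the HDR luminance values Windows reports for each output.
// Build with the Windows SDK; links against dxgi.lib.
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT a = 0; factory->EnumAdapters1(a, &adapter) != DXGI_ERROR_NOT_FOUND; ++a) {
        ComPtr<IDXGIOutput> output;
        for (UINT o = 0; adapter->EnumOutputs(o, &output) != DXGI_ERROR_NOT_FOUND; ++o) {
            ComPtr<IDXGIOutput6> output6;
            DXGI_OUTPUT_DESC1 desc{};
            if (SUCCEEDED(output.As(&output6)) && SUCCEEDED(output6->GetDesc1(&desc))) {
                // MaxLuminance / MaxFullFrameLuminance mirror what the display
                // advertises, i.e. the same data EDID-reading apps work from.
                std::printf("Adapter %u output %u: MaxLuminance %.0f nits, "
                            "MaxFullFrameLuminance %.0f nits, MinLuminance %.4f nits\n",
                            a, o, desc.MaxLuminance,
                            desc.MaxFullFrameLuminance, desc.MinLuminance);
            }
        }
    }
    return 0;
}
```

If this prints a MaxLuminance of around 450-455 nits in Console HDR mode while the panel can do roughly 1,000 nits in small windows, that would at least explain why so many apps clip at that level.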
03-25-2024 04:13 AM
The brightness is the same across all of them: around 1,000 nits up to a 2% window, about 450 nits at a 10% window, and 250 nits at 100%.
The only difference is the lower "reference white", which makes the overall picture darker in some scenes.
03-25-2024 04:16 AM
Do you have this monitor? Or did you have the MSI? Because I had both at home, so stop telling me lies 🙂 In bright scenes the ASUS is half as bright as the MSI.
03-25-2024 05:28 AM
@Richu6 wrote:
There is nothing that can be "too dim" or 'too bright"
When you can compare the exact same panel used by different brands, and this ASUS model is the outlier, then there certainly can be something that is comparatively "too dim".
@Richu6 wrote:
The only difference is the lower "reference white" that makes the overall picture in some scenes darker.
What is going on in your head? How can you make that out to not be an issue or bug? Talk about mental gymnastics.
@Richu6? You'd rather nothing change? Defending a company / something that clearly doesn't function as it should or live up to its potential comparatively. What a strange position to put yourself in. If it was intentional, as you speculate, then ASUS should at the very least have made that clear so people could have made an informed decision to purchase the MSI model, for example.
Besides that, your explanation makes no sense. Why would SDR brightness, commonly used for static desktop content, have a higher 100% APL brightness capability than the HDR modes, which also have logo detection for HUD elements in games, etc.?
And again, even if it was intentional, then ASUS needs to be held accountable for such ******ty business practices. Making decisions that are so detrimental to the HDR experience. And for what, lower chances of burn in? Read above on why that makes no sense.
Aside from this obviously being an issue, these panels are already held well back from their actual potential. They could release a firmware update with an HDR mode doing 1,200-1,300 nits at 2%, 700-800 nits at 10%, and 300-400 nits at 100%, and it would be completely fine burn-in wise.
We need some official communication from ASUS staff on this matter; it's already long overdue.
03-25-2024 05:52 AM
@DonDOOM wrote:
Why would SDR brightness, commonly used for static desktop content, have a higher 100% APL brightness capability than the HDR modes?
That's not true; 100% APL brightness is the same in both SDR and HDR, at around 250 nits.
03-25-2024 06:15 AM
@Rogex47 What? Have you just not read the thread and are not aware of what issue we're talking about?
A variety of ways to test this issue for yourself have been listed in this thread. I'll post one of them again:
"Easily test this out yourself by using an HDR capable browser, looking up 'winter fox hdr' on youtube and switching between the True Black 400 and Console mode."
03-25-2024 07:20 AM
I have read the thread, and I have also measured brightness in HDR with a Calibrite Display Plus HL. My measurements are all in line with reviewers like Monitors Unboxed, ranging from 1,000+ nits at 2% APL down to around 250 nits at 100%. In Cyberpunk 2077 I can set the peak brightness to around 1,100 nits without clipping, and in Final Fantasy VII Remake HDR works like it is supposed to.
However, you are right about "Winter Fox HDR". In Battlefield V I also couldn't set peak brightness higher than 500 nits without clipping.
I assume this is due to a "wrong" EDID entry that reports 455 nits peak brightness. Some applications, like NVIDIA RTX HDR, rely on the EDID information and use it as the max brightness. So I guess what is happening here is that your browser pulls the information from the EDID and sets peak brightness to 455 nits, while the monitor applies the ABL behavior of the "Console HDR" mode.
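For context on where an odd number like 455 could come from: the HDR Static Metadata Data Block in the EDID/CTA-861 extension stores luminance as a single coded byte, and the standard decode is 50 × 2^(CV/32) cd/m². A quick sketch; the code values below are hypothetical examples, not values I have read out of the PG32UCDM's EDID:

```cpp
// CTA-861-G HDR Static Metadata Data Block: "Desired Content Max Luminance"
// is stored as one byte CV and decodes to 50 * 2^(CV / 32) cd/m^2.
#include <cmath>
#include <cstdio>

double decode_cta_luminance(unsigned cv) {
    return 50.0 * std::pow(2.0, cv / 32.0);
}

int main() {
    // Hypothetical code values, for illustration only.
    std::printf("CV 102 -> %.0f nits\n", decode_cta_luminance(102));  // ~456
    std::printf("CV 138 -> %.0f nits\n", decode_cta_luminance(138));  // ~993
    return 0;
}
```

A code value around 102 decodes to roughly 456 nits, which would line up with the 455 nits figure, while a value around 138 would be needed to advertise roughly 1,000 nits.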
For my measurements I have downloaded HDR test patterns here:
https://diversifiedvideosolutions.com/hdr-10.html
And played them in Media Player Classic while measuring brightness with DisplayCAL. With Numpad 9 you can make the white area bigger until it fills the screen. You can try it yourself and you will see that full-screen white in Media Player Classic is significantly brighter than what you get in the YouTube video.
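If you want to sanity-check those patterns against your meter: they are PQ (SMPTE ST 2084) encoded, so you can convert a pattern's code value into the luminance it is supposed to hit. A small sketch of the standard PQ EOTF; the sample code values in main are arbitrary:

```cpp
// SMPTE ST 2084 (PQ) EOTF: maps a normalized PQ signal in [0,1] to absolute
// luminance in cd/m^2 (nits).
#include <algorithm>
#include <cmath>
#include <cstdio>

double pq_to_nits(double e) {                 // e = normalized PQ signal, 0..1
    const double m1 = 2610.0 / 16384.0;
    const double m2 = 2523.0 / 4096.0 * 128.0;
    const double c1 = 3424.0 / 4096.0;
    const double c2 = 2413.0 / 4096.0 * 32.0;
    const double c3 = 2392.0 / 4096.0 * 32.0;
    const double ep  = std::pow(e, 1.0 / m2);
    const double num = std::max(ep - c1, 0.0);
    const double den = c2 - c3 * ep;
    return 10000.0 * std::pow(num / den, 1.0 / m1);  // pq_to_nits(1.0) == 10000
}

int main() {
    // Full-range 10-bit code values; divide by 1023 to get the normalized signal.
    const int codes[] = {255, 512, 767, 1023};
    for (int code : codes)
        std::printf("10-bit code %4d -> %.1f nits\n", code, pq_to_nits(code / 1023.0));
    return 0;
}
```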
So is there an issue with the current HDR implementation?
If an application relies on the EDID information, then yes. If the application doesn't rely on the EDID, then HDR works as intended.
But in the end it is an assumption and we need official feedback.
03-25-2024 07:33 AM - edited 03-25-2024 07:35 AM
Do you have The Last of Us? I really want to see what you think of that game with Console HDR and then switching to HDR400.
Set everything to max brightness in the game settings; I'm telling you, it is way too dark on Console HDR.
On HDR400 it looks way better, but it still doesn't look right. Colors are washed out, like sunlight on the walls, the ground, inside houses, etc.
I love that game, but it just looks so bad on Console HDR that I don't even want to play it anymore.
It just looks gray and dull.
03-25-2024 07:46 AM - edited 03-25-2024 08:45 AM
@Rogex47 Thanks for the explanation. If it really is as simple as an incorrect EDID entry, then the fix for that supposedly releasing this week (according to one post here) should be the end of it.
Let's hope that's true; as others have pointed out, all the games I have tested show the same issue in full-screen bright scenes.
03-25-2024 09:55 AM
Cyberpunk clips for me at 450 nits, and full-screen bright scenes look too dark in all modes except True Black 400, just like in every other game.
Again, I do not understand why self-appointed adjudicators keep popping up in this thread trying to play detective. We already know what the issue is with the Asus implementation of this panel. We already know why it is happening and what the fix should be. No one complaining about this issue here is questioning the existence or mechanics of the problem.
03-25-2024 07:32 AM
@Rogex47 Start Final Fantasy in Console mode, then switch to True Black and it will be twice as bright in the daytime scene 🙂