PG32UCDM Console Mode HDR issue

FallenDeity
Level 8

My PG32UCDM, and others' as well, has a clipping issue on PC when using the Console HDR mode in Windows 11. Currently, the monitor clips at 450 nits in the Maximum Luminance window of the Windows HDR Calibration app, but it clips correctly in the Max Full Frame Luminance window. Is a firmware update in the works for this, or is this an RMA issue?

2,591 Views
678 REPLIES

RCLandroid87
Level 9

I just double-checked; mine does it too in Console HDR when using the Windows HDR Calibration app. It clips around 430 nits in the max luminance window but goes all the way to 1,000 nits in the full frame luminance test. WTF, that's a head scratcher for sure! Hopefully ASUS gets some firmware out that allows the monitor to work and calibrate correctly in Windows 11 23H2 when using Auto HDR / HDR enabled.

Yadooo
Level 7

I have the exact same issue as described by that brightness stress test. Console HDR is darker than True Black 400, which is very strange. In my case I was capped at 430-445 nits, so I forced Windows to output 1,000 nits using the calibration tool (the display settings now report 1,000 nits), but it still has the clipping issue. We definitely need a firmware update to fix this, as I have tried all the suggestions proposed.

wraith321
Level 9

I am just wondering if this is to be expected. I feel like there is a trade-off: the True Black mode has weaker highlights, but more detail ("brightness") in darker areas.

Otherwise, what "should" the benefit of the True Black mode be? Or rather, what's the expected outcome?
I know there is also a problem when viewing SDR content in HDR, which is quite a bit brighter in True Black mode.
Could this just be related to the fact that it's supposed to be capped at 400 nits, which allows it to raise the brightness of darker regions because it simply doesn't have the same peaks as the other modes?
Just trying to somehow make sense of all of this.

I do have problems with the different HDR modes as well, which might be related or maybe even the same problems.
But the behavior is so wonky, it's actually quite hard to tell.

Yeah, I'm definitely worried that this is "intentional", but it's also very misleading, and there should at least be an alternative mode that handles it differently.

The way I think HDR 1000 mode "should" ideally work is to not permanently lower brightness in low/medium-brightness scenes, but rather bring the brightness down only when the image exceeds ABL limits. So kind of how you would expect ABL to work.

To me it seems like there are two ways to design this:

  1. The way it currently works: all HDR content that doesn't have bright highlights is permanently dimmer in Console HDR mode, just so that the monitor can then boost highlights without dropping brightness elsewhere if bright highlights appear in the scene.
  2. Alternatively, it could keep medium-brightness content at the same level as True Black mode, but if highlights appear, ABL would only then dim the overall image in order to boost the highlight intensity.
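As an illustration only (this is not ASUS's actual algorithm, and all the nit values and the dimming factor are made-up assumptions), the two strategies could be sketched like this:

```python
# Hypothetical sketch of the two ABL strategies described above.
# All numbers are illustrative, not measured panel behavior.

def current_behavior(base_nits, highlight_nits, headroom=1000):
    """Strategy 1: pre-dim all content so highlights never force extra dimming."""
    dim = 0.5  # assumed permanent dimming factor, applied regardless of scene content
    return base_nits * dim, min(highlight_nits, headroom)

def proposed_behavior(base_nits, highlight_nits, abl_budget=300):
    """Strategy 2: keep the base level; dim only when a highlight exceeds the ABL budget."""
    if highlight_nits <= abl_budget:
        return base_nits, highlight_nits  # no dimming needed
    # Dim the overall image proportionally to afford the highlight boost
    dim = abl_budget / highlight_nits
    return base_nits * dim, highlight_nits

# A cloudy scene with no highlights (base 250 nits, highlight 0 nits):
print(current_behavior(250, 0))    # base is permanently dimmed to 125 nits
print(proposed_behavior(250, 0))   # base stays at the full 250 nits
```

Under strategy 2, the cloudy scene stays at its intended brightness, and dimming only kicks in when a bright highlight actually appears.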

I think the current way is problematic (and misleading) in a couple of ways:

  • It doesn't show up in typical monitor review test patterns (reviewers usually run the 2-50% window size tests with high 4,000-10,000 nit patterns). So a consumer might think "Oh, this new OLED manages 250 nit brightness even at 50-100% window sizes!", when in reality it drops to 120 nits (UNLESS there are bright highlights in the image to balance out the average brightness). (Or unless you only use the True Black 400 mode, which in return no longer has highlights brighter than 480 nits...)
  • And it also introduces a nasty compromise you have to make when watching HDR content or playing HDR games:
    • If you play in "Console HDR" mode with peak 1,000 nit highlights, then scenes with bright highlights, such as sun reflections or a sunset, can look really good. But in return, any scenes without highlights, such as cloudy weather or many indoor scenes, are permanently dimmer than they "should" be according to the advertised specs.
    • Or you can play in "True Black 400" mode and have correct brightness for those cloudy, highlight-free scenes, but in return never get the advertised 1,000 nit highlights when they appear.

So yeah... I definitely hope they at least add an alternative HDR mode that doesn't dim everything at low/medium brightness levels, but rather only activates ABL when the scene or highlights get so bright that it is required, even if in such situations that can result in shifting luminance levels elsewhere in the image.

The Alienware and MSI models didn't have the same behaviour, so it might not have been intentional. Even in Hardware Unboxed's review of this monitor, one of the tests shows it is significantly dimmer than the AW and MSI models in HDR, yet it looked fine in the next result. It looked like a major outlier, so I'm not sure why Tim from Hardware Unboxed didn't investigate the results. https://www.youtube.com/watch?v=qywLwR7KT9M&t=1550s

 ApolloEleven_0-1711131248304.png

 


This seems to be a panel characteristic. True Black 400 clips the brightness at 450 nits at small APL sizes, then provides more brightness at large APL, so most video looks brighter. The other HDR modes boost the brightness to 1000 nits at small APL, and then brightness is limited at large APL sizes.

I had an MSI 32 with the same panel. Everything was as it should be there. So it's not the panel's fault, it's Asus's. 

14900KS :: RTX 4090 :: Aorus Master X Z790 :: 64 GB RAM

That's definitely not how the panel is supposed to function. No such issues on the MSI or Alienware variants.

High APL content in Console HDR mode (and the other modes, except for TB 400 mode) should not be limited to 100 nits; it's ridiculous to assume that's normal behavior.

ApolloEleven
Level 10

Just to reiterate, you can see this is bugged in Monitors Unboxed's review, ASUS is at the bottom of the list: https://youtu.be/0ssesoCm4lU?si=Kp1L6WsEWbwF1X6C&t=1424  

ApolloEleven_0-1711131635388.png

 

Kundica
Level 10

In response to OP...

ConsoleHDR clips at 450 nits in the Windows HDR Calibration tool because that mode is designed to hard clip at peak luminance in a 10% window, and the Windows tool uses a 10% window to help the user determine max luminance. GamingHDR and CinemaHDR both use tonemapping, so when you use the Windows HDR Calibration tool you'll get over 1000 nits. This is the expected difference between a mode designed to hard clip in a 10% window and one that tonemaps. It's also worth mentioning that the result of the Windows HDR Calibration tool doesn't prevent the monitor from reaching over 1000 nits peak brightness in ConsoleHDR mode.

In order to determine this, I went through and measured all three 1000 nit modes in 10% and 2% windows to see how they track PQ and to diagnose any tonemapping. I also forced the Windows HDR Calibration tool to 200 nits and 1000 nits in ConsoleHDR mode, in addition to the 450 nits it measures. You can see in the attached measurements that ConsoleHDR mode calibrated to 200 nits and 1000 nits in the Windows tool tracks PQ exactly the same, reaching a maximum peak brightness of over 1000 nits in a 2% window. Calibrating to 450 nits in the tool also measures the same, but I only included the two extremes in my image. It's also clear that ConsoleHDR mode measured with a 10% window tracks PQ as expected, hard clipping at 467 nits.

Moving on to GamingHDR mode: in a 10% window you can see that it tonemaps to reach a peak brightness of 467 nits. CinemaHDR (not included in my image) is the same with more rolloff. The 2% window measurements of CinemaHDR and GamingHDR show that they both attempt to loosely track PQ to the full 1036 nits peak brightness they measure. This is in stark contrast to ConsoleHDR's 2% window behavior. I've seen a lot of people reference that other 4K 32" QD-OLED monitors don't clip at 450 nits in the calibration tool, and that's because their 1000 nit modes tonemap like GamingHDR and CinemaHDR on the Asus.

What doesn't make sense, and what I don't have an answer for, is why the max full frame luminance test clips at 1000 nits. A mismatch like this is typically something you'd see with some sort of funky dynamic tone mapping. My suggestion: if you want to use ConsoleHDR mode and the Windows HDR Calibration tool, set both to 1000 nits. Just know that ConsoleHDR mode doesn't track PQ well past roughly 300 nits on its way to 1000. It'll still do 1000 nits, just without tracking PQ. It's probably not a huge issue since we're talking about specular highlights in games, but it's worth mentioning.
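To make the hard clip vs tonemap distinction concrete, here's a rough sketch. The PQ constants are from SMPTE ST 2084; the 467 nit clip point mirrors the measurement above, but the knee value and rolloff curve are purely illustrative assumptions, not the monitor's actual algorithm:

```python
# PQ (ST 2084) decode plus two simplified display behaviors:
# hard clipping (ConsoleHDR-style) vs tonemapped rolloff (GamingHDR-style).

M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(signal):
    """Decode a [0, 1] PQ signal to absolute luminance in nits (SMPTE ST 2084)."""
    e = signal ** (1 / M2)
    num = max(e - C1, 0.0)
    den = C2 - C3 * e
    return 10000.0 * (num / den) ** (1 / M1)

def hard_clip(target_nits, clip=467.0):
    """10% window ConsoleHDR-style: track PQ exactly, then clip at the limit."""
    return min(target_nits, clip)

def tonemap(target_nits, knee=300.0, peak=467.0):
    """GamingHDR-style: track PQ up to an assumed knee, then roll off toward peak."""
    if target_nits <= knee:
        return target_nits
    # Simple asymptotic rolloff above the knee (illustrative, not the real curve)
    excess = target_nits - knee
    span = peak - knee
    return knee + span * excess / (excess + span)
```

The hard-clipping mode is what the Windows calibration tool's 10% window test detects: luminance stops rising at the clip point, so the tool reports it as max luminance. The tonemapping modes never flatten out completely, so the tool keeps seeing differences past 1000 nits.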

Regarding the separate concern about 1000nit modes being dimmer than the TrueBlack400 mode... It's apparent that ABL is behaving much differently between the two peak brightness modes. That doesn't necessarily mean it's broken, but it's certainly worthy of discussion given how aggressive the dimming is compared to other monitors using this panel. It's important to draw the distinction between OP's concern, which is a result of a difference between hard clipping/tonemapping, and ABL impacting the 1000nit/TB400 modes. There's a lot of chatter about both issues on this forum and reddit, so I feel it's important for everyone to be on the same page about what's being discussed.

Here are my measurements. It's worth noting that my panel's white point measures around 6200K for any mode using 6500K (all HDR modes, the 6500K color temp reset, and sRGB Cal mode), which is why there's a distinct difference in blue in my RGB values.

WCT_Compare_all.jpg