PG27UQ

Neon_Lights
Level 7
MarshallR wrote:
Just checked: PG27 production starts now in May, so mid/late May/early June for first markets depending on how it's being shipped.

Not sure about the X series yet. It's often mostly a matter of waiting for panel mass-production availability, and/or for mass production to meet a certain quality standard.

For the record: monitors don't wait for graphics cards. They just launch on their own schedule.





My original post:

https://www.inet.se/produkt/2210502/acer-27-predator-x27-4k-144hz-hdr-g-sync-quantum-dot

On this retailer site, the Acer Predator X27 is listed to ship April 5th. Since ASUS usually releases its version a little earlier than Acer and the other manufacturers, the PG27UQ will likely be released during March (assuming that shipping date is correct), which would also keep it within the Q1 2018 release window.


For reference, see the posts in this thread:

http://www.overclock.net/t/1620061/vc-asus-announces-swift-pg27uq-4k-ips-144hz-g-sync-hdr-monitor/10...

http://www.overclock.net/t/1620061/vc-asus-announces-swift-pg27uq-4k-ips-144hz-g-sync-hdr-monitor/11...

My PG27UQ's fans sound nearly like this from in front of the monitor. Anyone who thinks that's silent needs to see an ear doctor ;).

Maconi wrote:
I haven't tested it myself yet (waiting on my 2080 Ti to come in; my current video card can't push higher than 4K 60 Hz), but I've watched a video where someone used 120 Hz / 4K HDR / 4:4:4 / 8-bit. He even showed it displaying 10-bit patterns properly even though the monitor was set to 8-bit, which was odd (so in essence he was basically running 120 Hz / 4K HDR / 4:4:4 / 10-bit).



Thanks for the reply and video, buddy. It was actually an interesting watch, although I did note he didn't actually show the test again when he swapped back to 120 Hz 4K HDR. That's just being nitpicky, though.

I saw a post earlier in this thread saying 120 Hz / 4K HDR / 4:2:2 / 10-bit is the only possible setting, but this seems wrong. After seeing the video I'll have a fiddle around in the NVIDIA Control Panel.

Maconi wrote:
I haven't tested it myself yet (waiting on my 2080 Ti to come in; my current video card can't push higher than 4K 60 Hz), but I've watched a video where someone used 120 Hz / 4K HDR / 4:4:4 / 8-bit. He even showed it displaying 10-bit patterns properly even though the monitor was set to 8-bit, which was odd (so in essence he was basically running 120 Hz / 4K HDR / 4:4:4 / 10-bit).

https://youtu.be/HRepgcXxeaw?t=13m21s

(skip to 13:21 if it doesn't do it automatically)


What's up guys! First post on ROG forum ever hehe.
When I saw my face, I decided to make an account LOL.

So just to clarify some things: the monitor CAN'T display a 120 Hz / 10-bit / RGB color pipeline (equivalent to YCbCr 4:4:4).
However, what I show in the video is that even in 8-bit, once HDR is ON, the GeForce DRIVER or the SOFTWARE does dithering to map the colors to a 10-bit surface, which is impressive because the signal going through your DP cable is only 8-bit 😛

Someone more savvy than me seemed to confirm this on the Overclock forum.

In other words, I THINK that when HDR is ON, 8-bit and 10-bit give exactly the same results on this monitor!
Don't forget that it's not a true 10-bit panel, so even with a 10-bit signal there is dithering.
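To make the dithering idea concrete, here is a toy model (my own sketch in plain Python, not anything from the actual driver; the function name and frame count are made up): temporal dithering / FRC flips an 8-bit value between two neighboring codes so that its average over many frames approximates a 10-bit target.

```python
def frc_8bit_stream(value_10bit, frames=1000):
    """Toy temporal dithering: emit 8-bit codes whose time-average
    approximates a 10-bit target (how FRC gains ~2 extra bits)."""
    base, frac = divmod(value_10bit, 4)  # 4 sub-steps = 2^(10-8)
    out, err = [], 0.0
    for _ in range(frames):
        err += frac / 4            # accumulate the fractional remainder
        if err >= 1:               # every so often, bump up one 8-bit code
            out.append(min(base + 1, 255))
            err -= 1
        else:
            out.append(base)
    return out

stream = frc_8bit_stream(513)      # 10-bit 513 sits between 8-bit 128 and 129
avg = sum(stream) / len(stream)
print(avg * 4)                     # → 513.0 when rescaled to 10-bit
```

The eye (or a pattern test like the one in the video) integrates over frames, which is why an 8-bit signal plus dithering can reproduce 10-bit gradients.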

I have yet to see anyone, reviewer or user, say they can actually see a difference between 8-bit and 10-bit with HDR ON on this panel.

TL;DR: For SDR, run your monitor at 120 Hz / 8-bit / RGB (144 Hz will be fine once we get the firmware update).
For HDR, run your monitor at 120 Hz / 8-bit / RGB or 144 Hz / 10-bit / YCbCr 4:2:2.
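For anyone wondering where those mode limits come from, here is a rough back-of-envelope check (my own sketch, not an official calculation): uncompressed video bandwidth per mode versus the ~25.92 Gbit/s effective payload of DP 1.4 HBR3. It ignores blanking intervals, so real requirements are somewhat higher, but the pattern matches the modes discussed above.

```python
# Effective DP 1.4 HBR3 payload: 32.4 Gbit/s raw minus 8b/10b encoding overhead.
DP14_EFFECTIVE_GBPS = 25.92

def required_gbps(width, height, refresh_hz, bits_per_pixel):
    """Uncompressed bandwidth in Gbit/s (active pixels only, no blanking)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

modes = {
    "98 Hz / 10-bit RGB":     (3840, 2160,  98, 30),
    "120 Hz / 8-bit RGB":     (3840, 2160, 120, 24),
    "120 Hz / 10-bit RGB":    (3840, 2160, 120, 30),
    "144 Hz / 10-bit 4:2:2":  (3840, 2160, 144, 20),  # 4:2:2 halves chroma data
}

for name, args in modes.items():
    need = required_gbps(*args)
    verdict = "fits" if need <= DP14_EFFECTIVE_GBPS else "exceeds DP 1.4"
    print(f"{name}: {need:.1f} Gbit/s -> {verdict}")
```

Only the 120 Hz / 10-bit / RGB combination blows past the link budget, which is exactly the mode the monitor refuses to offer.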

Yeah, the confusing part is that in SDR you need to set it to "10 bit" to get 8-bit + FRC (dithering), yet in HDR you can set it to "8 bit" and still get 8-bit + dithering. It's almost like HDR has dithering built in somehow.

That just means there's no reason for me to ever mess with chroma subsampling. I'll likely just leave it on 4:4:4 and swap between 98hz/10bit for SDR and 120hz/8bit for HDR (unless the game I'm playing is SDR and can somehow handle 120hz @4k lol).

Thanks for the review BTW! It was one of the reasons I bought the monitor. :cool:

I'm glad I could help!

Yeah, for SDR don't hesitate to set 120 Hz, because SDR games are mastered for an 8-bit surface anyway, so 10-bit won't make a difference.
I just leave the refresh rate at 120 Hz for desktop/SDR gaming so I don't have to switch refresh rates each time.

I actually just got a PG27UQ, and I was wondering: is there any point in setting it to 12-bit? I know it's an 8-bit+FRC monitor for an effective 10-bit, but in the NVIDIA Control Panel I have the option for 12-bit at 82 Hz (RGB, I refuse to do 4:2:2). I haven't been able to find a single reference online to this monitor having a 12-bit mode, so I'm wondering if there is any difference. A different LUT in the monitor? Anyone know?
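Whatever the monitor does internally with a 12-bit signal, the 82 Hz cap itself is consistent with simple DP 1.4 arithmetic. A quick sketch (my own estimate; it ignores blanking intervals, which is why the driver's practical limit sits a few Hz below the raw ceiling):

```python
# Upper bound on refresh rate for 4K 12-bit RGB over DP 1.4 (HBR3),
# ignoring blanking intervals (the real usable limit is lower).
DP14_EFFECTIVE_GBPS = 25.92    # 32.4 Gbit/s raw minus 8b/10b encoding overhead
pixels = 3840 * 2160
bits_per_pixel = 36            # 12 bits per channel x RGB
max_hz = DP14_EFFECTIVE_GBPS * 1e9 / (pixels * bits_per_pixel)
print(round(max_hz, 1))        # ~86.8 Hz ceiling; blanking overhead lands it near 82 Hz
```

So 82 Hz looks like a pure bandwidth consequence, not evidence of a special 12-bit mode in the panel.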

bubba123
Level 8
At 10:16 it says the Acer Predator X27 has lower input latency than the PG27UQ: https://www.youtube.com/watch?v=2W9W_dYegn4

bubba123 wrote:
At 10:16 it says the Acer Predator X27 has lower input latency than the PG27UQ: https://www.youtube.com/watch?v=2W9W_dYegn4


They are the same monitor with a different chassis and a different fan; it's a measurement problem.

Theknystar
Level 9
So hello people!

I got my replacement model... it took a really long time, because of the company where I bought it, not because of ASUS. One and a half months... damn...

The replacement has no dead pixels, the fan isn't loud, and everything works as it should. The backlight bleed is a little worse, but nothing outrageous.

Let's hope this one won't break after two weeks.

I have nothing more to add; I wrote all my opinions in my previous post 🙂 Let's wait for the 2x 2080 Ti to push those frames ^^

ELIESEH
Level 11
If the PG35VQ comes in at $1,500, I will get it without thinking twice, for the following reasons:

I am much more interested in the PG35VQ than the PG27UQ because:

The PG27UQ maxes out at 98 Hz for full image quality in HDR (10-bit RGB 4:4:4), due to its 2160p resolution and the DP 1.4 bandwidth limitation. The PG35VQ is 1440p (ultrawide), so I am sure we will have at least 144 Hz without compromises: RGB 4:4:4, 10-bit, HDR.

I prefer 1440p over 2160p because of the higher frame rates in games (it's less demanding); even the 2080 Ti is not enough for the latest AAA games at 2160p (even 60 fps is not achievable), e.g. Assassin's Creed Odyssey, per the YouTube benchmarks.

VA works much better than IPS with HDR and FALD, thanks to its much better native contrast ratio; the halo effect on the PG27UQ is very bad because of the very low native contrast of IPS. The mega contrast of VA can reach a black level of approximately 0.05 cd/m2 at SDR brightness (as required by the VESA HDR specification) even without FALD, which means ultra-low to no halo effects.

35 inches is amazing. 2160p is approximately 1.7 times more demanding than 1440p ultrawide, and if you need to play at medium settings at 2160p, what you gain in pixel density you lose in graphics quality. 2160p was never a good choice for gaming; once you play at 100 fps and above, you can never go back to 50-60 fps. Smoothness is always king.
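The "~1.7 times more demanding" figure above is just the raw pixel-count ratio; a one-liner confirms it (assuming 3440x1440 for the PG35VQ's ultrawide resolution):

```python
# Pixel counts behind the "2160p is ~1.7x more demanding" estimate.
uhd_pixels = 3840 * 2160     # 16:9 4K (PG27UQ)
uw_pixels = 3440 * 1440      # 21:9 ultrawide 1440p (PG35VQ)
print(round(uhd_pixels / uw_pixels, 2))  # → 1.67
```

This is only a first-order proxy for GPU load, since per-frame cost also depends on geometry and shading, but it tracks fill-rate-bound games well.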

The response time of the VA panel in the PG35VQ is much better than any other VA panel on the market; it's a new VA panel model from AUO, since they quote a 4 ms GtG response time while every other VA panel on the market is quoted at 5 ms GtG.
I already watched the video of the X35's ghosting at 200 Hz on YouTube (UFO test); that is normal at such an ultra-high frequency, and if this panel can reach 200 Hz, it's a new technology, so at 120-144 Hz we will have much less ghosting than any IPS. By the way, IPS has ghosting too; I currently have the XB271HU and previously had the XG2703-GS, and both ghost at 144 Hz.