MarshallR@ASUS wrote:
144Hz at 4K is currently impossible until we get more display bandwidth. DP 1.2 and HDMI 2.0 provide the most bandwidth available today, but still not enough for >60Hz at 4K. In fact, HDMI 2.0 has even less bandwidth than the older DP 1.2. Apple uses a custom solution to drive 5K@60Hz on its iMac.
DP 1.3 will bring more bandwidth suitable for 4K@120Hz:
4K@60Hz requires 11.96Gbit/s; 120Hz is double that at 23.92Gbit/s; but 144Hz needs 28.704Gbit/s, which is beyond even the DP 1.3 spec (~25.92Gbit/s of effective bandwidth).
Since we already push the G-Sync chip to its limit in the WQHD/144Hz Swift (it has to work ~7% harder than it would at 4K@60Hz), you will need to wait for Nvidia's next-generation G-Sync chip, both for that reason and for DP 1.3 support (don't ask me when that is), as well as for LCD panel makers to develop fast 4K panels, before 4K at >60Hz arrives.
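Running those numbers myself (a quick Python sketch; it assumes raw 24-bit pixel data with no blanking overhead, which is roughly where the quoted figures come from, so mine land slightly lower):

```python
# Rough data-rate math for uncompressed video.
# Assumes raw 24-bit RGB pixels and ignores blanking overhead.
def data_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw pixel data rate in Gbit/s."""
    return width * height * bits_per_pixel * refresh_hz / 1e9

for hz in (60, 120, 144):
    print(f"3840x2160 @ {hz}Hz: {data_rate_gbps(3840, 2160, hz):.2f} Gbit/s")

# DP 1.3: 32.4 Gbit/s raw, ~25.92 Gbit/s effective after 8b/10b coding
print(f"DP 1.3 effective bandwidth: {32.4 * 0.8:.2f} Gbit/s")

# The Swift (2560x1440 @ 144Hz) vs 4K@60Hz -- the "~7% harder" claim
swift = data_rate_gbps(2560, 1440, 144)
uhd60 = data_rate_gbps(3840, 2160, 60)
print(f"Swift pushes {swift / uhd60 - 1:.1%} more pixel data than 4K@60Hz")
```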
Yeah, the data rate will always be partitioned between resolution and refresh rate. Those who want the highest resolution available on any given link will be stuck at 60Hz (or lower). 4K will arrive at 120Hz while 5K monitors sit at 60Hz; when 5K goes to 120Hz, 8K will be available at 60Hz. Apple can do a lot of crazy things when they build the computer and the display together.
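That partitioning is easy to see if you invert the same math: fix the link budget and solve for the maximum refresh rate at each resolution (same no-blanking assumption as above, so real limits land a bit lower):

```python
# Max refresh each resolution can hit on a given link budget,
# assuming uncompressed 24-bit pixels and no blanking overhead.
LINK_GBPS = 25.92  # DP 1.3 effective payload (32.4 Gbit/s raw, 8b/10b)

RESOLUTIONS = {
    "4K (3840x2160)": (3840, 2160),
    "5K (5120x2880)": (5120, 2880),
    "8K (7680x4320)": (7680, 4320),
}

for name, (w, h) in RESOLUTIONS.items():
    max_hz = LINK_GBPS * 1e9 / (w * h * 24)
    print(f"{name}: up to ~{max_hz:.0f}Hz")
```

On DP 1.3 that works out to roughly 4K at 120Hz, 5K stuck near 60Hz, and 8K waiting for the next link generation, which is exactly the staircase described above.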
I will almost always choose the lower-resolution, higher-refresh display for gaming. Comparing the Swift to 4K monitors is fair, but it's apples to oranges. The Swift is probably the best high-refresh-rate monitor you can get at this point.
Note that while the brain starts losing the perception of individual frames in motion-blurred material at about 30Hz, and the flicker-fusion threshold sits around 60Hz, there are two major things that make high-refresh-rate gaming really fantastic. The first is that the game is rendered at more points in time, each one stopped and fixed, which your eye blends together into a smoother image. Think of the illusion of multiple mouse cursors when you move the mouse quickly across the screen: as the refresh rate rises, more and more cursors appear, fainter and fainter, until they merge into one motion-blurred streak.
The second is how your vision deals with that natural motion blur. If your eye is tracking something onscreen, the eye panning across each still frame creates motion blur (the image of the frame is smeared by its opposite motion across the retina); the shorter each frame is visible, the less blur. Smooth pursuit is what the eye naturally uses to prevent motion blur: by matching the motion of the eye to the motion of an object, its image stays steady on the retina (usually the fovea), and you can then make out details and identify the object. On a display this can be improved even further with ULMB, which strobes the backlight to minimize the distance the eye travels while the frame is visible: the image appears only for a split second, and if your eye is tracking at the right speed, the next frame appears for a split second in the same place on the retina, giving the brain a clear image to process.
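A useful rule of thumb here (an illustrative sketch, not a measurement): perceived blur width is roughly the eye's tracking speed multiplied by how long each frame stays lit, which is why cutting persistence with strobing beats raising the refresh rate alone:

```python
# Rule of thumb: blur width (px) ~= tracking speed (px/s) * persistence (s).
# The tracking speed and strobe length below are illustrative values.
def blur_px(speed_px_per_s, persistence_s):
    return speed_px_per_s * persistence_s

SPEED = 2000  # eye tracking an object crossing the screen at 2000 px/s

# Sample-and-hold: each frame stays visible for the whole refresh interval.
for hz in (60, 120, 144):
    print(f"{hz}Hz sample-and-hold: ~{blur_px(SPEED, 1 / hz):.1f} px of blur")

# ULMB-style strobing: the backlight flashes for only ~1-2 ms per frame,
# so persistence (and thus blur) is set by the strobe, not the refresh rate.
print(f"144Hz + 1.5ms strobe:   ~{blur_px(SPEED, 0.0015):.1f} px of blur")
```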
This all vastly improves the experience, especially in high-motion games where tracking an object onscreen, such as a target you are bringing weapon sights onto, is important. You can use your instinctive, natural processes for dealing with motion. After having played games at high refresh rates, it would be hard to go back, and I hope that in time 240Hz and higher become available.