Given the lofty price tag, there is a good chance the ASUS PQ321Q is targeting graphics and print professionals, so meeting the sRGB standard of 80 cd/m2 and its custom gamma curve will be important.

Looking at the grayscale first, the sRGB calibration is just as good as our 200 cd/m2 target. The gamma is virtually perfect, and there is no color shift at all. The contrast ratio falls to 667:1, which I expected, as the lower light output leaves less room for adjustments. Graded on grayscale and gamma alone, the PQ321Q would be perfect.
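The sRGB target is not a flat 2.2 power curve; the spec defines a piecewise transfer function with a linear segment near black, which is why we grade against its custom gamma curve rather than a plain power law. A minimal sketch of that curve, plus the black level implied by the measured contrast ratio (constants from the sRGB spec; this is illustrative, not the review's actual measurement code):

```python
def srgb_eotf(v):
    """Map a normalized sRGB signal (0-1) to relative luminance (0-1).

    sRGB is piecewise: a linear segment below 0.04045 and a 2.4-power
    segment above it, which together approximate an overall ~2.2 gamma.
    """
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

# Black level implied by the measured numbers: an 80 cd/m2 white point
# at a 667:1 contrast ratio leaves roughly 0.12 cd/m2 for black.
white_cdm2 = 80.0
black_cdm2 = white_cdm2 / 667
```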

As soon as we get to the gamut, we see the same issues I expected. The gamut is just a little off, which gives us noticeable dE2000 errors at 100% saturation for all colors.

Here with the color checker charts, we see a large difference between the Gretag Macbeth results and the 96-sample results. The average error rises from 1.62 to 2.05 because the larger set samples more orange/yellow shades that fall outside of the gamut. Nothing is really different from the last calibration, so the same issues apply.
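For readers unfamiliar with how these averages are formed: each chart patch is measured, its distance from the reference color is computed in CIELAB, and the per-patch errors are averaged. The review uses dE2000; the sketch below uses the much simpler CIE76 formula (plain Euclidean distance, which dE2000 refines with lightness, chroma, and hue weighting) and hypothetical patch values, purely to illustrate why adding more out-of-gamut orange/yellow samples pulls the average up:

```python
import math

def delta_e_76(lab1, lab2):
    """Euclidean distance between two CIELAB colors (CIE76).

    dE2000 refines this with lightness/chroma/hue weighting, but the
    core idea -- distance between measured and reference color -- is
    the same.
    """
    return math.sqrt(sum((m - r) ** 2 for m, r in zip(lab1, lab2)))

# Hypothetical (measured, reference) CIELAB pairs, not real PQ321Q data.
# A sample set with more out-of-gamut orange/yellow patches produces
# larger per-patch errors, which raises the reported average.
pairs = [((51.0, 0.5, -0.5), (50.0, 0.0, 0.0)),    # near-neutral: small error
         ((70.0, 12.0, 60.0), (70.0, 10.0, 57.0))]  # saturated yellow: larger
avg_error = sum(delta_e_76(m, r) for m, r in pairs) / len(pairs)
```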

The saturation results here are also virtually identical. They start out with small errors, but by the end every color except cyan shows a noticeable error at 100%.

For 200 cd/m2 with a gamma of 2.2, or for 80 cd/m2 with the sRGB gamma, the ASUS PQ321Q performs almost equally. The grayscale and gamma are perfect, but the gamut has some issues. Once we start to see more displays using this same panel, but with different electronics and possibly different backlights, we can determine what is causing this shift in the gamut. With the initial target for the ASUS likely being professional designers, these errors seem a bit out of place.

166 Comments

  • msahni - Tuesday, July 23, 2013 - link

    very costly..... hope these displays become mainstream soon....
    Higher resolution/ppi does make a big difference at least for people using their computers all day...
    Even when I jumped from a 1366x768 laptop to a 1920x1080 laptop and then to a rMBP the difference is truly there.... Once you go to the higher resolution working on the lesser one really is a pain...

    Cheers
  • airmantharp - Tuesday, July 23, 2013 - link

    The cost is IGZO; 4k panels cost only slightly more than current panels when using other panel types like IPS, VA or PLS.
  • Death666Angel - Tuesday, July 23, 2013 - link

    And where can I buy monitors with the panels you speak of? I'd like a 4k monitor for about 800 €, maybe even 1000 € (I paid 570 for my Samsung 27" 1440p, so that seems fair if the panels only cost slightly more)....
  • airmantharp - Tuesday, July 23, 2013 - link

    Look up Seiko, they're all over the place. 30Hz only at 4k for now, but that's an electronics limitation; the panels are good for 120Hz.
  • Gunbuster - Monday, July 29, 2013 - link

    Seiki
  • sheh - Tuesday, July 23, 2013 - link

    Why does the response time graph show no input lag for the monitor?

    Can it accept 10-bit input? Does 10-bit content look any better than 8-bit?

    "Common film and video cadences of 3:2 and 2:2 are not properly picked up upon and deinterlaced correctly."

    Why expect a computer monitor to have video-specific processing logic?
  • cheinonen - Tuesday, July 23, 2013 - link

    Because I had to change from SMTT (which shows input lag and response time) as our license expired and they're no longer selling new licenses. The Leo Bodnar shows the overall lag, but can't break it up into two separate numbers.

    It can accept 10-bit, but I have nothing that uses 10-bit data as I don't use Adobe Creative Suite or anything else that supports it.

    The ASUS has a video mode, with a full CMS, to go with the dual HDMI outputs. Since that would indicate they expect some video use for it, testing for 2:2 and 3:2 cadence is fair IMO.
  • sheh - Wednesday, July 24, 2013 - link

    Thanks.

    Alas. It'd be interesting to know the lag breakdown. If most is input lag, there's hope for better firmware. :)

    Are 10-bit panels usually true 10-bit or 8-bit with temporal dithering?
  • DanNeely - Thursday, July 25, 2013 - link

    Some of the current generation high-end 2560x1600/1440 panels are 14-bit internally and have a programmable LUT to do calibration in the monitor instead of at the OS level. (The latter is an inherently lossy operation; the former is much less likely to be.)
  • mert165 - Tuesday, July 23, 2013 - link

    I'd like to know how a Retina MacBook Pro and a new MacBook Air hold up to the 4k display. The Verge a while back published a demo and the results were not spectacular. Although in their demo they didn't go into depth as to WHY the results were so poor (weak video card, bad DisplayPort drivers, other???)

    Could you connect up the new Haswell MacBook Air to see performance?

    Thanks!
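DanNeely's point above about OS-level calibration being lossy is easy to demonstrate: pushing a correction curve through an 8-bit LUT collapses distinct input levels (banding), while a high-bit-depth monitor LUT preserves them. A rough sketch, using an arbitrary 0.9-power curve as a stand-in correction:

```python
# Apply an example correction curve (arbitrary 0.9 power) through LUTs of
# different bit depths and count how many distinct output levels survive
# for the 256 possible 8-bit source values.
levels = range(256)

# 8-bit LUT: output is rounded to 0..255, so nearby inputs collide.
lut_8bit = [round(255 * (v / 255) ** 0.9) for v in levels]
unique_8bit = len(set(lut_8bit))    # fewer than 256: levels lost (banding)

# 14-bit LUT: output is rounded to 0..16383; every source level survives.
lut_14bit = [round(16383 * (v / 255) ** 0.9) for v in levels]
unique_14bit = len(set(lut_14bit))  # all 256 levels remain distinct
```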
