Asus readies 4K "ultra HD" monitor

Asus is readying a 31.5-inch display with 3840x2160 pixels, four times the pixel count of a standard HD display.

The panel uses Indium Gallium Zinc Oxide ("IGZO") instead of silicon, which allows for smaller transistors and, hence, the greater pixel density. Asus claims a 176-degree viewing angle and 8ms response times.

The PQ321 has DisplayPort and dual HDMI connectors, stereo speakers, and 3.5mm audio I/O. It weighs 13kg and is 35mm thick.

Pricing hasn't been announced, but Asus is well known for competitive pricing: the Taiwanese company's well-liked 27" 2560x1440 display is a relatively cheap $633 at Amazon.

At 140 DPI, it's not quite up to the "retina"-style resolution of recent cellphones and laptops—but then again, you're probably not holding it 12 inches from your mug.
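
That 140 DPI figure is just the panel's diagonal pixel count divided by its diagonal size; a quick sanity check:

```python
import math

# Pixel density of the PQ321: 3840x2160 panel, 31.5" diagonal (figures from above).
w_px, h_px, diag_in = 3840, 2160, 31.5
ppi = math.hypot(w_px, h_px) / diag_in
print(round(ppi))  # 140
```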

IBM sold a similarly high-resolution monitor, the T221, for several years. With its slow refresh rate and the dimensions of a giant's briefcase, though, it never appealed beyond the business niche. Asus's PQ321, by contrast, will be a relatively slim 35mm.


  1. Finally! Though I don’t know why they need new tech if they can get that DPI on cell-phone screens.

    1. The new tech may be puffery about an incremental advance used for other reasons.

      It could also be that the reason we usually see very high DPI only in small devices is defect rate:

      If your process, unavoidably, results in X defects across a sheet of glass that is being processed into a hundred smartphone displays, it’s perfectly possible to get decent yield even with moderately high values of X. If that same amount of screen area is going to be chopped into two desktop panels, on the other hand, you could easily end up making more scrap than product.
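
      A toy Poisson yield model makes the arithmetic concrete (all numbers here are illustrative assumptions, not real process data):

      ```python
      import math

      # Toy Poisson yield model: P(zero defects on a panel of area A) = exp(-D * A),
      # where D is the defect density. Illustrative numbers, not industry data.
      sheet_area = 100.0              # one mother-glass sheet, arbitrary units
      defects_per_sheet = 10.0        # "X defects across a sheet of glass"
      D = defects_per_sheet / sheet_area

      phone_area = sheet_area / 100   # sheet cut into 100 smartphone panels
      desktop_area = sheet_area / 2   # same sheet cut into 2 desktop panels

      phone_yield = math.exp(-D * phone_area)      # ~0.90: decent yield
      desktop_yield = math.exp(-D * desktop_area)  # ~0.007: mostly scrap
      ```

      With the same defect density, about nine in ten phone panels survive, while a desktop-sized panel is almost certainly scrap.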

      1. Yep, that’s the only thing that makes sense to me, but for whatever reason they’re not advertising this as a process with fewer defects – they’re just saying it allows smaller transistors. Cell phones have already solved the small-transistor issue across several technologies.

      2. Any reason they can’t chop up the working large sections into large monitors and the subsections that work into smaller displays, like a puzzle?

        1. I don’t know, I’m afraid, exactly where in the process the final geometry decision has to be made.

          At some point, you can’t subdivide any further (if you look at the edges of an LCD panel, stripped of its frame and drive circuits, the traces on the glass at the edges differ from those within the image area, so you would certainly have to decide before those are laid out). It’s possible that production-line standardization demands keep you from handling larger screens and smaller ones efficiently on the same line.

          With silicon chips, they can often salvage a partial-good part quite late in the game, but those are both much more expensive per unit area, and you don’t actually have to physically remove a defective area, just cut its power supply, which makes things less challenging.

    2. Ah… And with Windows 8, you can even have as many as 4 Metro apps open at the same time… and you can see the app menu on the whole screen…

    1. You should write a letter, or realize that they are using a term that is standard across an industry, or both!

      1. There are various 4K formats: the actual standard the film industry uses, and the Ultra HD standard, where they adopted the 4K name to be (and I’m guessing here) less confusing. At the very least, it’s misleading.

    1. That depends (as they say):

      HDMI 1.4 has sufficient bandwidth to drive a display of this resolution at 30Hz, and devices conforming to that spec should do so. Some reasonably recent video cards with ‘DVI’ outputs may actually have HDMI interfaces on DVI connectors, for compatibility purposes. If your card is one of these, and its HDMI version is recent enough, a simple mechanical adapter should do the trick. A ‘true’ DVI connector, or a pre-1.4 HDMI-over-DVI connector, will at best drive this at reduced resolution, possibly not at all.

      DisplayPort, for the most part, should be OK. The monitor specifies v1.2; but initial uptake of DisplayPort was… tepid… so there aren’t too many earlier products actually in the wild.
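
      A quick check of the 30Hz figure (assuming the standard CEA/HDMI 4K pixel clocks of 297 MHz and 594 MHz, against HDMI 1.4’s 340 MHz TMDS clock ceiling):

      ```python
      # HDMI 1.4 tops out at a 340 MHz TMDS clock, i.e. 340 Mpixel/s of total
      # (active + blanking) pixel traffic on a single link.
      HDMI_1_4_MAX_MHZ = 340.0

      PIXEL_CLOCK_4K30_MHZ = 297.0  # standard 3840x2160@30 timing
      PIXEL_CLOCK_4K60_MHZ = 594.0  # standard 3840x2160@60 timing

      print(PIXEL_CLOCK_4K30_MHZ <= HDMI_1_4_MAX_MHZ)  # True: 30Hz fits
      print(PIXEL_CLOCK_4K60_MHZ <= HDMI_1_4_MAX_MHZ)  # False: 60Hz needs DisplayPort 1.2
      ```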

  2. 8 ms response time?  Seems slow enough for gamers to complain about – and I would think that would be a hefty slice of the target market.

  3. It’s pretty surprising to me: even basic HD sets come with three or four HDMI inputs, and then you have something like this, which seems designed to attract the videophile who would have a lot more than two HDMI devices to connect, and it only offers two.

    How much can it add to the cost to install more HDMI ports? I’m certain that someone dropping the multiple thousands that large-format HD sets go for isn’t going to want to clutter things with whatever the HDMI equivalent of a KVM switch is.

    1. In the case of A/V enthusiasts, the receiver (in addition to its historical sound-related duties) usually supplies HDMI switching.

      Something like this (no specific endorsement; the product is, however, representative) will give you 5 HDMI ins, in addition to a variety of analog inputs (which I think it can transmogrify into an HDMI out, so you only need a single connection to the TV/monitor).

      I suspect that that depresses the demand for TV/monitors with 5-6 ports somewhat: the high-rollers are going to be using a receiver anyway, for various flavors of advanced audio decode, and to drive something less awful than integrated speakers, so they don’t have any real reason to pay more to get extra ports on the TV. The cheapskates can get a 5-port HDMI switch at under $50, and they are probably the ones who will judge the price bump for adding it (more elegantly) directly inside the monitor most harshly.

      The people using it as a monitor might value the picture-in-picture capabilities and second port if they have a game console, or occasionally watch some TV or something; but if they are switching between multiple computers, they’ll need a KVM anyway to handle the keyboard and mouse side of things.

  4. Does anyone actually think this will make a perceptible difference in quality over standard HD, to an average human at an average viewing distance?

    1. Many people think it. That doesn’t mean it’s true.

      People will be buying 4K TVs and streaming the same low-bitrate content on them, and swearing up and down that their lives are better because of it.

    2. Yep. Higher pixel densities mean it’s harder to see aliasing, and bigger pixel dimensions mean you can fit more windows of the same size on the desktop, both of which are useful to a person using a computer.

      It’s not all about playing movies, you know. 

    3. Is there any perceptible difference between a 1200 dpi laser printer and 300 dpi ink jet?

      I’ve been watching ASUS roll out this monitor and plan on pre-ordering two for my work setup.

    4. Nope. Sure, in the store, where you can literally stand 3 feet from a 60″ screen showing 4K content, it’ll look better than real life. In my living room, where it’s 10+ feet away (18 for me), it’ll look no better than HD, or in my case 480p.

      And like Judonerd said, a lot of it comes down to bandwidth, not just resolution. Regular digital HD (1080p) looks pretty good, but if I watch it on my computer it looks bad – in fact, it looks worse than a DVD upsampled to HD resolution. It looks bad because of the artifacting and macroblocking due to insufficient bandwidth and overcompression of the source. I’m sure 4K is only going to make it worse.

    5. One can make the argument that, in a living-room environment, 4K makes no sense for anything under about 70″.

      But this isn’t meant for a living room. Since the average viewing distance for a computer monitor is less than two feet (and this is a computer monitor), I DO think it will make a perceptible difference.

    6. The proper viewing distance is a function of resolution and display size. At 8K, the proper viewing distance is actually less than the height of the display!

       Yes, it will, without a shadow of a doubt, make a difference for a desktop user. If you were to use it at the typical viewing distance for a TV – i.e., something other than what it was made for – the difference would not be noticeable at all.
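
       That relationship is easy to put numbers on. Assuming the common one-arcminute rule of thumb for visual acuity, the distance beyond which a 140 PPI panel’s pixels blur together is:

       ```python
       import math

       # Distance at which a 140 PPI pixel subtends one arcminute (a common
       # rule-of-thumb acuity limit); beyond this, extra pixels are invisible.
       ppi = 140.0
       pixel_pitch_in = 1.0 / ppi
       one_arcmin_rad = math.radians(1.0 / 60.0)
       retina_distance_in = pixel_pitch_in / math.tan(one_arcmin_rad)
       print(round(retina_distance_in))  # 25 inches
       ```

       About two feet: right at desktop viewing range, and far closer than anyone sits to a TV.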

  5. IBM has had a similarly high-resolution monitor out, the T221, for several years. With slow refresh rate and the dimensions of a giant’s briefcase, though, it was too slow and ugly to appeal beyond the business niche. 

    That business niche being radiography, and to a certain extent, prepress.

  6. There are a few of these 4K monitors around where I work. They are rather nice. I do work for a company that creates video-editing hardware and software, and we have a cinema/editing suite with a 4K/3D projector too.

    The last 4K monitor I worked on was hooked up to a PC with an Nvidia Quadro card and two Nvidia Tesla cards, all top of the line too… They wouldn’t let me install any games on it :-(
