
For most of us, gaming resolutions increase fairly slowly, one periodic monitor purchase at a time. Ten years ago, 1080p was the new hotness; I was personally using a 22-inch Acer with a 1680×1050 resolution, and wondering what all the fuss was about with 1080p compared with 720p. Today, 4K displays are becoming more common, with additional support for features like HDR, upcoming support for FreeSync, and even a whisper of a promise that OLED technology might finally go mainstream in such panels (though to date, no OLED display we're aware of has qualified for the Rec. 2020 standard).

In short, it's not just game resolutions that are improving. We're seeing measurable improvements in display technology as well. OLED, FreeSync, 4K, and HDR? Sign us up.

But for those of you who prefer to game on sheer resolution alone, Linus Tech Tips has put together an astonishing video of how they managed to build a 16K gaming rig with $10,000 in GPUs and 16 4K displays, built from 16 27-inch Acer Predator panels into a massive 108-inch diagonal array (not counting bezels). The rig was powered by four Quadro P5000s with 16GB of RAM each, based on the GTX 1080 but with additional RAM, which is more or less required to make this kind of thing functional.

This feat of engineering and power consumption (the monitors alone consume 1100W of wall power) is an impressive look at what may be possible in the future, and the sheer chutzpah of building a rig this enormous is impressive in and of itself. At the same time, however, the results demonstrate just how difficult it is to push the envelope in this fashion. Downright ancient games, like Half-Life 2 and Minecraft, were able to play in the 40fps range, but even a game as recent as Civilization V bogs down at 20fps. Modern games, like Rise of the Tomb Raider, are stuck in the 2-3fps range.

The problem here, I suspect, is two-fold. First, and most obvious, the aforementioned display technology that keeps all of the Quadro cards synchronized and the 16 4K panels stitched together means the entire game's working set has to fit within the 16GB of RAM aboard each GPU. The problem with increasing resolution is that you're slamming the gas pedal on how much data must be stored in memory when rendering over 5 billion pixels per second, with all the textures, lighting, shading, and detail that goes into the process. Given that 16K carries 16x more data than 4K and 64x more than 1080p, 15360×8640 is a monster resolution for even the most aggressive GPUs.
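To put those numbers in perspective, here's a rough back-of-the-envelope sketch in Python. It counts only a plain 32-bit color buffer per frame; textures, geometry, and intermediate render targets, which dominate real-world VRAM usage, are deliberately left out.

```python
# Back-of-the-envelope pixel and framebuffer math for a few resolutions.
# Only a plain 32-bit color buffer is counted; textures, geometry, and
# intermediate render targets (the bulk of real VRAM usage) are ignored.

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "4K": (3840, 2160),
    "16K": (15360, 8640),
}

BYTES_PER_PIXEL = 4  # 8-bit RGBA color buffer
BASE_PIXELS = 1920 * 1080  # 1080p as the reference point

for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    frame_mb = pixels * BYTES_PER_PIXEL / (1024 ** 2)
    print(f"{name:>5}: {pixels:>11,} pixels "
          f"({pixels / BASE_PIXELS:4.0f}x 1080p), ~{frame_mb:6.1f} MB per frame")
```

Even by this deliberately conservative math, a single uncompressed 16K frame comes out around half a gigabyte, before any of the actual rendering data the GPU has to keep resident in that 16GB.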

When Will We See 16K on the Desktop?

Ideally? Never. And no, I'm not just being a sourpuss. The bottom line is this: The human eye is only capable of resolving so much detail at any given distance. There's a literal, biological limit to the minimum feature sizes we can resolve at any given distance. Typically, the very, very best of us hit around 20/8, meaning the eagle-eyed among us can see at 20 feet what the rest of us can only distinguish at eight feet. Our maximum resolvable feature detail increases as objects get closer to us, which is why higher resolutions are still useful for VR, even when they can't be appreciated at longer range.

4K viewing distances

The benefits of 4K resolution are largely a function of screen size and how far you sit from it, but the larger your screen is, the farther back you need to sit to see it properly. A 16K system with a 109-inch diagonal isn't going to be fully visible from a viewing distance of 2-4 feet, and that's how far back you'd typically be sitting. That's not to say a huge wall-computer isn't awesome, but you don't need it to be 16K to be similarly wowed.

This chart shows television viewing distances and screen sizes as a function of whether higher resolutions are actually useful. Typically we've used it for 4K, but there's no reason you can't apply it to monitors as well; just assume vastly shorter viewing distances than the typical 6-8 feet. Even at such high resolutions, there's just not going to be much use for them, and that assumes GPUs can manage to drive the hardware at all. If it takes 1100W just to display the 16K image, you can imagine what the GPUs and the rest of the system are consuming.
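For a rough sense of where that limit falls, here's a small Python sketch built on the common rule of thumb that 20/20 vision resolves about one arcminute of detail. The one-arcminute cutoff and the example screen sizes are simplifying assumptions; real acuity varies with the viewer, contrast, and content.

```python
import math

# Rough angular-resolution check: 20/20 vision resolves roughly one arcminute
# of detail. Beyond the distance where a single pixel subtends less than that,
# extra resolution is effectively invisible. Simplified model only.

ARCMINUTE_RAD = math.radians(1 / 60)

def max_useful_distance_ft(diagonal_in, horiz_px, vert_px):
    """Farthest distance (in feet) at which individual pixels are still resolvable."""
    diag_px = math.hypot(horiz_px, vert_px)
    pixel_pitch_in = diagonal_in / diag_px  # physical size of one pixel, in inches
    return pixel_pitch_in / math.tan(ARCMINUTE_RAD) / 12

for label, diag, w, h in [
    ("27-inch 4K monitor", 27, 3840, 2160),
    ("65-inch 4K TV", 65, 3840, 2160),
    ("109-inch 16K wall", 109, 15360, 8640),
]:
    print(f"{label}: pixels blur together beyond ~{max_useful_distance_ft(diag, w, h):.1f} ft")
```

By that estimate, a 65-inch 4K TV stops paying off past roughly four feet, and the pixels on a 109-inch 16K wall blend together before you're even two feet away, far closer than you'd ever sit to take in the whole screen.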

The effective end of conventional Moore's Law scaling means the chances we'll see 16K resolutions, even in VR, are exceptionally slim. That doesn't mean video technology can't continue to advance. We've seen a number of impressive technologies come to market, from OLED to HDR to FreeSync/G-Sync, none of which put huge burdens on the GPU and all of which improve the experience of gaming. GPU frame rates and RAM buffers will continue to increase. But I expect the largest gains over the next few years to come from improvements to color fidelity, ghosting, larger color gamuts, and gameplay smoothness, not a relentless focus on higher pixel counts.