It’s no secret that 4K has become the new mainstream display technology, no longer sitting at the premium end of the market. It’s on nearly every TV, current consoles support it, and the upcoming Xbox Series X and PlayStation 5 have even bigger plans for it.
Amid 4K’s rise, buzz is already emerging around 8K, the next step in display resolutions. But it’s something gamers should simply ignore. Increasing resolution has its benefits, but they dwindle after 4K, and the many costs of moving to 8K make it not only unreasonable but also a waste of gamers’ time and effort.
There’s more to a game than resolution
There’s a reason the Final Fantasy VII Remake isn’t just a port of the original game offered up at 4K resolution. The textures and models in the original game have low resolutions and limited polygon counts. Take the original character model for Cloud: it has few enough polygons that you could count them, and most of them are a single, flat color. Modern game models have polygon counts in the tens of thousands, with detailed textures applied to them. Looking at the character model for Cloud in the FFVII Remake, you’ll struggle to identify individual polygons, let alone count them. As games have advanced, so have the details of game assets, and all of those visual enhancements demand their share of a gaming system’s processing power.
Then there are other features, like physics, lighting, and shadows, that improve a game’s look and feel but also demand processing power. The recent smattering of ray tracing in games is an excellent example: it promises to dramatically improve the quality of game lighting by having light rays behave more realistically, bouncing off objects and casting more accurate shadows. But we’ve seen that the improvement comes at a serious performance cost. Even on Nvidia’s RTX graphics cards, which feature purpose-built ray tracing cores, enabling ray tracing in games can drop frame rates by nearly 50%, and that’s with the game running at just 1080p.
That brings us to frame rate – the aspect of a video game that makes it a video game and not an interactive slideshow. To be considered playable, most games need to hit at least 30 fps, and many gamers push for far higher; plenty of high-end monitors can display 144 or even 240 fps. The higher the frame rate, the smoother the game feels and the more detail you can make out in moving scenery. And increasing resolution takes a direct toll on frame rate: every extra pixel is more work the graphics card has to do every single frame.
Even the most powerful graphics cards struggle with 8K. We’ve seen Nvidia’s Titan RTX (a $2,500 card) fail to muster 30 fps in Gears 5 at only medium settings when forced to render at 8K. So, if the Titan RTX can’t do it, the hardware inside most gaming rigs, as well as the Xbox Series X and PS5, isn’t likely to have much luck either.
4K is talked about, but 8K is only mentioned
We’ve seen TV manufacturers showing off their expensive 8K TVs, and we’ve heard Microsoft and Sony mention 8K support on their upcoming consoles. But there’s likely a reason we don’t hear much more detail than that.
While 8K displays might be a reality, 8K gaming isn’t yet feasible. And there are much better things than resolution for hardware and software developers to focus on.
The console manufacturers are instead aiming for 4K at 120 fps. Those high frame rates will make fast-moving action in games much easier to follow.
The developer behind Gran Turismo, Kazunori Yamauchi, even said in a recent press event that he believes “display resolution-wise, 4K resolution is enough,” GTPlanet reported. He instead emphasized high frame rates from 120 to even 240 fps.
Sony and Microsoft have also given more concrete detail on just about every other feature aside from 8K. Both of their consoles will support variable refresh rate technology. Ray tracing of both light and sound is also likely to be a standout feature in next-gen games, and it will be a costly one for the hardware.
To deliver all that, there won’t be the headroom to simply quadruple the number of pixels the consoles’ graphics processors have to pump out.
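To put “quadruple” in numbers, here’s a quick back-of-the-envelope sketch in Python, using the standard 3,840 x 2,160 and 7,680 x 4,320 UHD resolutions:

```python
# Standard UHD resolutions: 4K (3840 x 2160) and 8K (7680 x 4320).
pixels_4k = 3840 * 2160           # 8,294,400 pixels per frame
pixels_8k = 7680 * 4320           # 33,177,600 pixels per frame

print(pixels_8k / pixels_4k)      # 4.0 -- every 8K frame has four times the pixels of a 4K frame
print(f"{pixels_8k * 60:,}")      # ~2 billion pixels per second at 60 fps
```

Four times the pixels per frame means roughly four times the rendering work per frame – headroom the next-gen consoles would rather spend on frame rate, ray tracing, and richer assets.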
Even outside of consoles, 8K talk is quiet. In a recent blog post, Nvidia explored the features that are driving gaming forward. Among them were ray tracing and faster frame rates, but there wasn’t a single mention of 8K.
AMD may be building the hardware that’s going into the new consoles, but even so, the company has been reluctant to target even 4K with its own graphics cards. AMD’s latest graphics cards are tailor-made for 1440p and 1080p, and when Big Navi launches, it will likely target 4K.
You won’t see 8K anyway
Even if you turn down a bunch of settings or get some future hardware that can handle 8K better, it all comes down to this: You probably won’t see an improvement outside of very limited cases. Once technology hit 4K, we reached a point of diminishing returns from further increases in resolution.
That’s because 4K displays already tend to have such a high pixel density that you can’t make out individual pixels, so squeezing even more of them into the same space won’t make a perceptible difference.
If you have 20/20 vision, you’d need to be less than four feet away from a 55-inch TV for 8K to offer a noticeable improvement over 4K, or you’d need to be about 2 feet away from a 35-inch 8K monitor. Any farther away, which you likely will be, and 4K is going to get the job done.
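If you’re curious where those distances come from, here’s a rough sketch of the math in Python. It assumes a 16:9 panel and the common rule of thumb that 20/20 vision resolves about one arcminute of detail; both are approximations, not hard optical limits:

```python
import math

def max_useful_distance_feet(diagonal_in, horizontal_pixels, arcmin=1.0):
    """Farthest viewing distance at which an eye that resolves `arcmin`
    arcminutes can still pick out individual pixels. Beyond this distance,
    packing in more pixels makes no visible difference."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)    # width of a 16:9 panel
    pixel_pitch_in = width_in / horizontal_pixels       # size of one pixel
    distance_in = pixel_pitch_in / math.tan(math.radians(arcmin / 60))
    return distance_in / 12

# How close do you have to sit before 4K pixels become visible
# (i.e., before 8K could offer any improvement)?
print(max_useful_distance_feet(55, 3840))   # ~3.6 feet for a 55-inch TV
print(max_useful_distance_feet(35, 3840))   # ~2.3 feet for a 35-inch monitor
```

Sit any farther back than that, and a 4K panel is already past the limit of what your eyes can resolve – which is the whole point.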
In fact, you’re much more likely to notice the lower frame rate or reduced graphical settings you sacrificed to achieve an 8K picture.
So, if you dial down your in-game graphics settings and buy expensive hardware that still can’t really handle 8K, the only thing you’ll get out of it is roughly 25 million extra pixels you won’t even be able to see.