The revolutionary PC gaming tech developers are ignoring

Variable Rate Shading, or VRS, is a major piece of graphics tech that PC games have largely ignored for the past three years. It works on all modern AMD and Nvidia graphics cards, and it has a simple goal: Improve performance by as much as 20% without any perceivable drop in image quality.

Sounds amazing, right? Well, there’s a reason you probably haven’t heard much about it. The last couple of years have focused on Nvidia’s Deep Learning Super Sampling (DLSS) and AMD’s FidelityFX Super Resolution (FSR) as the performance-saving champions of the modern graphics era. And although they offer the best bang for the game developer’s buck, VRS is an equally impressive tool that’s been woefully underused.

Variable Rate Shading: Not new


VRS isn’t new — Microsoft’s blog post announcing the feature in DirectX 12 is over three years old. If you’re not familiar, VRS changes the resolution at which shaders are applied within a scene. It’s not changing the resolution of the game; VRS simply allows neighboring pixels to share a shader rather than having the GPU do redundant work.

If there’s a corner of a scene wrapped in shadow without a lot of detail, for example, your graphics card doesn’t need to calculate the light, color, and texture values for each pixel. It can save some hassle by grouping them together — four pixels in a 2×2 grid may have extremely similar shading values, so VRS kicks in to optimize performance by running the shader once and applying the result to the rest of the grid. The size of the grid is the shading rate, and more pixels in a grid means a coarser, cheaper shading rate.
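To make the arithmetic concrete, here’s a minimal sketch (in Python, purely for illustration — real VRS happens in the GPU’s rasterizer, not in application code) of how a coarser shading rate cuts the number of pixel-shader invocations for a region of the screen:

```python
# Sketch: a shading rate of NxM means one pixel-shader invocation
# covers an NxM block of pixels instead of one invocation per pixel.
import math

def shader_invocations(width, height, rate_x, rate_y):
    """Invocations needed to cover a width x height region at a given rate."""
    return math.ceil(width / rate_x) * math.ceil(height / rate_y)

full = shader_invocations(3840, 2160, 1, 1)   # 1x1: every pixel shaded
half = shader_invocations(3840, 2160, 2, 2)   # 2x2: four pixels share one result

print(full)  # 8,294,400 invocations to shade a full 4K frame at 1x1
print(half)  # 2,073,600 -- a 4x reduction for any region shaded at 2x2
```

In practice only parts of the frame use the coarser rate, which is why real-world gains land closer to 10% to 20% than to a flat 4x.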

That small change can make a big difference in performance. In Gears Tactics at 4K, for example, VRS offered a 22.9% increase in my average frame rate. That’s the best example, but Resident Evil Village also showed a 9.8% increase in my average frame rate, while Hitman 3 offered a solid 8% boost. And the idea behind VRS is that it should be indistinguishable when it’s turned on, essentially offering free performance.

VRS performance in three video games.

There are only a small number of games that support VRS on PC, despite it being more than three years old. I’ll address that issue later in the column, but the more pressing issue is how VRS is used among the few games that support it.

There are two buckets for VRS: One that makes it look like a revolutionary piece of kit that offers free performance, and another that makes it look like a feature that hurts more than it helps.

Two worlds of VRS

A debug screen for VRS in Dirt 5.

Microsoft has two tiers of VRS in DirectX 12 Ultimate: The aptly-named Tier 1 and Tier 2. Tier 1 VRS is the most common technique you’ll find in games, which is the heart of the problem. This level doesn’t concern itself with individual pixels, and it instead applies different shading rates to each draw call. When there’s a call to draw background assets, for example, they may have a 2×2 shading rate, while assets drawn in the foreground have a shading rate of 1×1.

Tier 2 VRS is what you want. This is far more granular, allowing the developer to shade within a draw call. That means one part of a model can have a shading rate of 2×2, for example, while a more detailed area on that same model could use 1×1. Tier 2 VRS is ideal, allowing the developer to focus on the details that matter to squeeze every ounce of performance out.
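The difference between the two tiers comes down to granularity, and a toy model makes it easy to see why Tier 2 wins. Here’s a hedged sketch (hypothetical tile counts and detail scores, not real engine data) comparing the cost of one draw call covering a grid of 8×8-pixel tiles, where Tier 1 must pick a single rate for the whole call and Tier 2 can pick a rate per tile:

```python
# Sketch (hypothetical numbers): Tier 1 vs. Tier 2 shading cost for one
# draw call covering 100 tiles of 8x8 pixels. Each tile has a detail
# score in [0, 1]; high-detail tiles deserve the full 1x1 rate.

TILE = 8 * 8  # pixels per tile

def tile_cost(rate):
    # Shader invocations for one tile at a uniform NxN shading rate.
    return TILE // (rate * rate)

def tier1_cost(detail_map, rate):
    # Tier 1: one shading rate for the entire draw call.
    return sum(tile_cost(rate) for _ in detail_map)

def tier2_cost(detail_map, threshold=0.5):
    # Tier 2: full 1x1 rate where detail is high, 2x2 everywhere else.
    return sum(tile_cost(1) if d >= threshold else tile_cost(2)
               for d in detail_map)

# A scene where a quarter of the tiles are detailed (faces, edges, text).
detail_map = [0.9] * 25 + [0.1] * 75

print(tier1_cost(detail_map, 1))  # 6400: safe, but wastes work on flat areas
print(tier2_cost(detail_map))     # 2800: full rate only where it matters
```

Tier 1 faces an ugly choice: shade everything at 1×1 and save nothing, or drop the whole draw call to 2×2 and visibly degrade the detailed tiles. Tier 2 avoids the trade-off entirely.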

VRS comparison in Resident Evil Village.
Left: VRS Off, Right: VRS On

The problem: Even among the small pool of games that support VRS, most of them only use Tier 1. Resident Evil Village, the most recent game I looked at, uses Tier 1 VRS. You can see how that impacts the image quality above, where you can make out blocky pixels in the snow as Tier 1 VRS lumps together everything more than a few feet from the camera.

Contrast that with Gears Tactics, which supports Tier 2 VRS. There’s a minor difference in quality when zoomed in to nearly 200%, but it looks much nicer than Tier 1. You can spot a difference when the two are side-by-side and zoomed in, but put these two frames back to back in a blind test, and you wouldn’t be able to tell a difference. I certainly couldn’t.

VRS comparison in Gears Tactics.
Left: VRS Off, Right: VRS On

Free performance for virtually no loss in image quality is a huge deal, but on PC at least, VRS isn’t in the conversation as much as it should be (let alone the distinction between Tier 1 and Tier 2). Even after Gears Tactics and Gears 5 moved to Tier 2 VRS, developers haven’t jumped on the performance-saving train. Instead, VRS adoption has mostly focused on the limited power budgets of consoles, and there’s one particular console holding the feature back.

A console blockade

A PS5 standing on a table, with purple lights around it.

The reason VRS comes in two flavors is that Tier 2 requires specific hardware to work. Nvidia’s RTX graphics cards and AMD’s RX 6000 GPUs have hardware support, as does the Xbox Series X. Older graphics cards and the PlayStation 5 do not. Instead, they use a software-based version of Tier 1 VRS, if it’s even available in the game at all.

Developers working on multi-platform titles are usually going to focus on the lowest common denominator, which means Tier 1 VRS. There are only a few developers who have gone out of their way to support Tier 2 VRS on supported hardware (id Software uses Tier 2 VRS on Doom Eternal for the Xbox Series X, for example), but the vast majority of modern AAA games either don’t support VRS or use this Tier 1 approach.

As Gears Tactics shows, a proper Tier 2 implementation from the developer offers the best image quality and performance. It’s true that DLSS and FSR provide an easy solution for developers to improve performance in PC games. But proper Tier 2 VRS can represent around a 20% boost for barely any difference in image quality, and that’s too good to ignore.

This article is part of ReSpec – an ongoing biweekly column that includes discussions, advice, and in-depth reporting on the tech behind PC gaming.



Mac OS X Is 20 Years Old. Here’s Why It Was So Revolutionary

Today marks the 20th anniversary of Mac OS X, the Mac operating system that changed everything. Arriving only a couple of years after the first iMac, it helped forge Apple’s image as the king of cool — and changed computing forever.

At the turn of the millennium, Apple was the talk of the tech world. The company had nearly gone bust before Steve Jobs’ dramatic return in 1997, but just a year later, it launched the playful, colorful iMac G3 to massive acclaim. While the hardware felt downright space age, the operating system looked dated, full of dull grays and boxy windows.

Then along came Mac OS X in March 2001. Sporting the bright, bold Aqua theme with its drop shadows and translucent blue menus, it was the visual makeover Mac software so desperately needed. Now, using the computer felt as good as looking at it.

The attention to visual detail helped define Apple as a company that “got it,” one that flaunted its design prowess as if no one were looking — but in reality, the whole world was. As the iMac G3 had before it, Mac OS X showed that computers need not be the exclusive domain of nerdy techno-shamans and button-down businessmen. Instead, they could be fun, frivolous, and feisty. In other words, they belonged as much in the living room as they did in the boardroom.

This was the era of the “sunflower” iMac G4, the Power Mac G4 Cube, the iPod. Apple was miles ahead in the hardware design world, and once Mac OS X came along, it could rightly claim that was true for software as well. Yet this was not inoperable, impenetrable software laden with jargon. It was something anyone could use. As Steve Jobs famously said: “It just works.”

Beauty and brains

What made Mac OS X so great was not just its gorgeous looks — it also had a ton of awesome features. The Dock? That was Mac OS X. The Mail app? That too. It saw the launch of other apps like Address Book (now known as Contacts) and TextEdit that are still with us today. And it was underpinned by key features like the Terminal, OpenGL graphics support, AppleScript, protected memory, and many more. It was not just a pretty face — it had the brains, too.

In the years since Mac OS X launched, it has gone through many changes, including its name, which switched to OS X in 2012 and macOS in 2016. Gone are the price tag (the first version of Mac OS X cost $129) and the PowerPC processors, with Intel on the way out, too. But those are all, arguably, positive losses. Mac OS X has lost remarkably little of what originally made it great 20 years ago.

These days, there is a renewed focus on the Mac after many years seemingly in the wilderness, largely thanks to the shot in the arm Apple’s M1 chips have given the platform. Here’s hoping we are on the brink of another golden age of the sort we enjoyed under Mac OS X.
