Nvidia announced DLSS 2.0 on Monday, taking its revolutionary Deep Learning Super Sampling (DLSS) technology to the next level.
Ray tracing may get all the headlines (and the DirectX 12 Ultimate integration), but it’s only part of the appeal for GeForce RTX 20-series graphics cards. DLSS taps the dedicated tensor cores inside the cutting-edge GPUs, using their machine learning chops to boost frame rates and increase the resolution you’re able to play at—a key complement to ray tracing, as the lighting technology severely taxes performance when it’s active.
Nvidia’s wholly new DLSS 2.0 implementation quells pretty much every complaint about the first version: the company claims DLSS 2.0 is much faster, much better looking, much less restrictive, and much easier for developers to work into their games.
Get this: Nvidia even claims that DLSS 2.0 can deliver images better than native quality, at frame rates faster than native rendering in some scenarios.
Better performance that gets even better over time is part of DLSS’s core appeal. Because it relies on an AI model trained on Nvidia’s supercomputers, with updates sent down to your gaming PC’s graphics card via driver releases, DLSS can learn and improve long after a game’s launch. We saw that in practice when Battlefield V’s and Metro: Exodus’s initially ho-hum DLSS 1.0 implementations improved markedly after release.
DLSS 2.0—which will soon come to Control and MechWarrior 5—should hit the ground in much better shape. The technology’s upgrades have already been confirmed by independent testing: Nvidia quietly rolled DLSS 2.0 out in Wolfenstein: Youngblood and Deliver Us The Moon to great effect, as Hardware Unboxed revealed in the video below.
DLSS 1.0 vs. DLSS 2.0
DLSS 1.0 sounded great in theory, but it often stumbled in practice.
Yes, the technology improved frame rates, but Nvidia told gamers that the increased performance would come at no loss to visual quality. That didn’t always happen. Battlefield V and Metro: Exodus looked like Vaseline had been smeared across your screen with their initial DLSS implementations, for example. Worse, DLSS 1.0 worked only at certain preset resolutions tied to your specific graphics card’s capabilities. Even when a game supported lower resolutions, so much of the burden was placed on your graphics card’s tensor cores that it spent less horsepower rendering frames, which meant the performance trade-offs at 1080p resolution were “probably not the best,” Nvidia’s Tony Tamasi admitted in a briefing with reporters.
DLSS 1.0 did cool things, but in the real world, it was far from the revolutionary tech promised at the GeForce RTX 2080 Ti’s unveiling.
Behind the scenes, DLSS 1.0 could only be trained on a per-game basis—meaning Nvidia needed to devote its supercomputer’s prowess to each game that integrated the technology. Making matters worse, most games aren’t perfectly deterministic, Tamasi said. Even if you only change the resolution settings for a game, some systems (like particles) won’t replicate exactly run after run. As a result, the low-resolution images Nvidia used to train its supercomputer wouldn’t always exactly match the company’s ultra-high resolution “ground truth” images, invalidating the data.
Getting DLSS 1.0 working took a lot of heavy lifting on Nvidia’s end, which might be why many of the games that Nvidia claimed would get DLSS wound up dropping those plans.
DLSS 2.0 uses a “fundamentally different” AI model that Nvidia claims is twice as fast as before. It accumulates data from multiple frames over time to generate its high-quality output frame, then feeds that frame back into the model to help inform the next frame. This increases the amount of data going into the AI model, which helps Nvidia generate much higher-quality images. Better yet, the company can use a universal network for DLSS 2.0, rather than requiring per-game training.
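DLSS 2.0’s actual network is proprietary, but the temporal loop described above—blend new low-resolution samples into a running high-resolution output, then feed that output back in for the next frame—can be sketched in miniature. The `upsample_2x` and `accumulate` functions below are purely illustrative stand-ins, not anything from Nvidia’s SDK:

```python
# Toy sketch of temporal accumulation (illustrative only; DLSS 2.0's
# real pipeline uses a neural network, motion vectors, and jittered
# sampling, none of which appear here).

def upsample_2x(frame):
    """Nearest-neighbor 2x upscale of a 2D list of pixel values."""
    out = []
    for row in frame:
        wide = [p for p in row for _ in range(2)]
        out.append(wide)
        out.append(list(wide))
    return out

def accumulate(history, current, blend=0.1):
    """Blend the newly upscaled frame into the running history buffer,
    so detail accumulates across frames instead of coming from one."""
    if history is None:
        return current
    return [
        [h * (1 - blend) + c * blend for h, c in zip(h_row, c_row)]
        for h_row, c_row in zip(history, current)
    ]

# Feed a stream of 2x2 low-res frames through the feedback loop.
history = None
for _ in range(10):
    low_res = [[0.0, 1.0], [1.0, 0.0]]   # stand-in for a rendered frame
    history = accumulate(history, upsample_2x(low_res))

print(len(history), len(history[0]))  # 4 4 (2x upscaled output)
```

The key property is the feedback: each output frame becomes an input to the next, which is why the model has far more data to work with than a single low-resolution frame would provide.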
The end result? “DLSS 2.0 offers native resolution image quality using half the pixels,” Nvidia claims—meaning it can run much faster at the same image quality, or improve both the image quality and the speed of your game, as shown in the comparison below.
And because the new AI model uses your graphics card’s tensor cores more efficiently, DLSS 2.0 is twice as fast as the original, Nvidia says. Not only does that improve frame rates, but it eliminates the need for those aggravating resolution restrictions so common in DLSS 1.0 games. If you own a GeForce RTX graphics card, you can play DLSS 2.0 games the way you want to. That’s a major quality-of-life improvement.
DLSS 2.0 will offer three quality modes, based around varying levels of image upscaling. Performance mode uses a 4X image upscale to spit out frames even faster, while Quality mode dials that back to a 2X upscale for optimal visuals. Balanced mode splits the difference. For reference, DLSS 1.0 topped out at 2X upscaling, which is now the minimum offered thanks to the tensor core performance optimizations—and that’s in the image-enhancing Quality mode. Giddy up.
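To make the scale factors concrete, here’s the arithmetic under one reading of Nvidia’s numbers—assuming the factors describe total pixel count and that scaling is equal on each axis (the article doesn’t specify either, so treat this as a sketch, and note Balanced’s exact factor isn’t given):

```python
def internal_resolution(out_w, out_h, pixel_scale):
    """Internal render resolution for a given output resolution and
    pixel-count upscale factor (assumes equal scaling per axis)."""
    axis_scale = pixel_scale ** 0.5
    return round(out_w / axis_scale), round(out_h / axis_scale)

# 4K output under the factors the article gives:
print(internal_resolution(3840, 2160, 4))  # Performance (4X) -> (1920, 1080)
print(internal_resolution(3840, 2160, 2))  # Quality (2X)     -> (2715, 1527)
```

So at a 4K target, Performance mode would render only a 1080p frame internally and let the AI model fill in the rest, which is where the headline frame-rate gains come from.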
The technology can even make fine details appear more crisp than native rendering thanks to the way the upscaling works, as shown below in Quality mode. DLSS 2.0 is also much better at rendering objects in motion. To illustrate, Tamasi showed a fan spinning behind a mesh gate in Control. As you can see, DLSS 1.0 looked much grainier than DLSS 2.0.
While DLSS 2.0 now uses a universal AI model rather than per-game training, Nvidia says developers will still need to actively integrate the technology. It should be straightforward if a game already supports temporal anti-aliasing (TAA), Tamasi said, and if it doesn’t, enabling DLSS 2.0 is similar to implementing TAA. Epic’s Unreal Engine is adding DLSS as a standard branch soon.
As stated previously, Wolfenstein: Youngblood and Deliver Us The Moon already support DLSS 2.0. Nvidia says MechWarrior 5 and Control—already the flagship game for real-time ray tracing—will soon be updated to include the technology. When Control’s new The Foundation DLC launches later this week, the base game will be patched to upgrade its DLSS implementation from 1.0 to 2.0. We can’t wait to try it.