Nvidia announced DLSS 2.0 on Monday, taking its revolutionary Deep Learning Super Sampling (DLSS) technology to the next level. 

Ray tracing may get all the headlines (and the DirectX 12 Ultimate integration), but it’s only part of the appeal for GeForce RTX 20-series graphics cards. DLSS taps the dedicated tensor cores inside the cutting-edge GPUs, using their machine learning chops to boost frame rates and increase the resolution you’re able to play at—a key complement to ray tracing, as the lighting technology severely taxes performance when it’s active.
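The performance win comes from a simple trade: the GPU shades far fewer pixels internally, and the upscaler reconstructs the target resolution. The sketch below is purely conceptual (not Nvidia's implementation) and uses a hypothetical 1440p-to-4K ratio to show roughly how much shading work is saved.

```python
# Conceptual sketch (not Nvidia's actual pipeline): DLSS-style upscaling
# renders internally at a lower resolution and reconstructs the target
# resolution, so per-frame shading cost scales with the internal pixel count.

def pixel_count(width: int, height: int) -> int:
    """Total pixels the GPU must shade per frame at a given resolution."""
    return width * height

# Hypothetical example: a 4K output reconstructed from a 1440p internal render.
target = pixel_count(3840, 2160)    # native 4K: 8,294,400 pixels
internal = pixel_count(2560, 1440)  # internal 1440p: 3,686,400 pixels

savings = 1 - internal / target
print(f"Pixels shaded per frame: {internal:,} vs. {target:,} at native")
print(f"Shading workload reduced by ~{savings:.0%}")
```

The reconstruction step has its own cost on the tensor cores, so the real-world frame-rate gain is smaller than the raw pixel savings—which is exactly why the quality of that reconstruction matters so much.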

Nvidia’s wholly new DLSS 2.0 implementation quells pretty much every complaint about the first version, as Nvidia claims DLSS 2.0 is much faster, much better looking, much less restrictive, and much easier for developers to work into their games.

Get this: Nvidia even claims that DLSS 2.0 can deliver images better than native quality, at frame rates faster than native rendering in some scenarios.

Better performance that gets even better over time is part of DLSS’s core appeal. Because it relies on information output from supercomputers, sent down to your gaming PC’s graphics card via driver updates, DLSS can learn and improve long after a game’s launch. We saw that in practice when Battlefield V and Metro: Exodus’s initially ho-hum DLSS 1.0 implementations improved markedly after release.

DLSS 2.0—which will soon come to Control and MechWarrior 5—should hit the ground in much better shape. The technology’s upgrades have already been confirmed by independent testing: Nvidia quietly rolled DLSS 2.0 out in Wolfenstein: Youngblood and Deliver Us the Moon to great effect, as Hardware Unboxed revealed in its video testing.

DLSS 1.0 vs. DLSS 2.0

DLSS 1.0 sounded great in theory, but it often stumbled in practice.

Yes, the technology improved frame rates, but Nvidia told gamers that the increased performance would come at no loss to visual quality. That didn’t always happen. Battlefield V and Metro: Exodus looked like Vaseline was smeared across your screen with their initial DLSS implementations, for example. Worse, DLSS 1.0 worked only at certain preset resolutions tied to your specific graphics card’s capabilities. Even when a game supported lower resolutions, so much of the burden was placed on your graphics card’s tensor cores that less horsepower went to rendering frames, which meant the performance trade-offs at 1080p resolution were “probably not the best,” Nvidia’s Tony Tamasi admitted in a briefing with reporters.