Nvidia DLSS 2.0 Delivers Improved Speed, Quality and Compatibility (2024)


Nvidia's DLSS stands for Deep Learning Super Sampling (our analysis here), and is a technique that was introduced together with the RTX 2000-series graphics cards in September 2018. Now, Nvidia is introducing DLSS 2.0, which ups the ante further, promising improved performance, customizability, and compatibility.

Deep Learning Super Sampling is a deep-learning-based rendering technique that uses the Tensor cores on Nvidia's RTX graphics cards to increase game performance. In essence, it renders a game at a lower resolution than is displayed, then uses a deep-learning model to upscale the result to the monitor's resolution. You get a game that looks nearly as sharp as if it were rendered at native resolution, but with the improved performance of rendering at a lower one.

Technically, DLSS 2.0 has already been out in the wild, but Nvidia didn't make a big deal about the changes or provide any details until now. Nvidia claims that DLSS 2.0 uses a new AI model that runs twice as fast as the model of the original DLSS, which is a rather significant performance boost. Nvidia also promises improved image quality thanks to new temporal feedback techniques, offering sharper and more detailed frames.

One of the biggest changes is that DLSS 2.0 is a general network, and no longer game-specific. This means that whereas DLSS 1.0 had to be implemented by the developer and trained specific to each game, DLSS 2.0 will work without additional training across a much wider range of games. It will still need to be implemented by the developers (which is supposedly relatively simple to do), but as a general AI network, it is much less work to integrate into each individual title.

Nvidia has also addressed one of the primary complaints surrounding DLSS: too much of a sacrifice in image quality versus running the game at its native resolution. In response to this, Nvidia added three modes for DLSS 2.0: Quality, Balanced, and Performance. The performance mode allows for up to a 4x resolution super-sample, which per Nvidia's example, would upscale 1080p gaming all the way to 4K. For comparison, the original DLSS topped out roughly at a 2x-resolution super-sample.
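Those modes map to per-axis render-resolution scale factors. A minimal sketch of the arithmetic: the Performance factor of 0.5 per axis matches Nvidia's 1080p-to-4K example (a 4x pixel-count super-sample), while the Quality and Balanced factors below are illustrative assumptions rather than published figures.

```python
# Per-axis render-scale factors for each DLSS 2.0 mode.
# Performance (0.5) matches Nvidia's 1080p -> 4K example;
# Quality and Balanced values here are illustrative assumptions.
SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Resolution the GPU actually renders for a given output and mode."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

w, h = render_resolution(3840, 2160, "Performance")
print(w, h)                   # 1920 1080
print((3840 * 2160) / (w * h))  # 4.0 -- the 4x pixel-count super-sample
```

The pixel-count ratio, not the per-axis factor, is what Nvidia's "4x" figure refers to: halving both width and height quarters the number of pixels rendered.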

With the image-quality improvements and customization, there now ought to be a setting that works for all gamers. DLSS 2.0 is currently implemented in Control, MechWarrior 5: Mercenaries, Deliver Us The Moon and Wolfenstein: Youngblood, with more titles likely coming soon. Nvidia also made DLSS 2.0 available to Unreal Engine 4 developers.

Paired with the new DirectX 12 Ultimate and DXR ray tracing, 2020 is an exciting time for graphics APIs and rendering techniques. With new hardware hopefully around the corner, 4K gaming is becoming ever more feasible for the everyday consumer.



Niels Broekhuijsen

Niels Broekhuijsen is a Contributing Writer for Tom's Hardware US. He reviews cases, water cooling, and PC builds.


Comments from the forums

  • cryoburner
    DLSS was a breakthrough, DLSS 2.0 takes it further

    DLSS was useless, at least on current-gen RTX parts. It provided worse image quality and/or performance than simply using existing upscaling and image sharpening techniques that don't require any special hardware. Even Nvidia ended up releasing an updated sharpening filter last year that when coupled with upscaling made DLSS completely redundant.

    This means that whereas DLSS 1.0 had to be implemented by the developer and trained specific to each game, DLSS 2.0 will work without additional training across a much wider range of games.

    This tells us that there is likely no "deep learning" involved, and Nvidia simply gave up on DLSS, replacing it instead with basic upscaling, followed by the aforementioned updated sharpening filter. The tensor cores may not even be getting utilized for it anymore. DLSS was never "super sampling" which implies rendering at a higher resolution and scaling down, but rather the opposite. Now, it likely doesn't involve "deep learning" either, so the name is not at all representative of what it actually is.

    However, it's good to see Nvidia recognize that DLSS was a failure, and replace it with something better. Having a simple-to-use means of upscaling to run demanding games on higher-resolution screens is a good thing. It's also nice to see that they gave it a few quality levels, so one can choose how much of a tradeoff to image quality they want to make a game run better. If this is using traditional techniques, there's no reason it couldn't run on AMD or Intel graphics hardware though.


  • bit_user

    cryoburner said:

    This tells us that there is likely no "deep learning" involved, and Nvidia simply gave up on DLSS, replacing it instead with basic upscaling, followed by the aforementioned updated sharpening filter. The tensor cores may not even be getting utilized for it anymore.

    That's not what they're saying.

    NVIDIA DLSS 2.0 is a new and improved deep learning neural network that boosts frame rates and generates beautiful, sharp images for your games. It gives you the performance headroom to maximize ray tracing settings and increase output resolution. DLSS is powered by dedicated AI processors on RTX GPUs called Tensor Cores.

    https://developer.nvidia.com/dlss
    Why are you so quick to write off the potential for AI in upscaling, just because their first attempt wasn't great? Man, if people stopped trying stuff when the first attempt doesn't work too well, humanity would still be limited to digging in the mud with sticks.


  • cryoburner

    bit_user said:

    That's not what they're saying.

    It kind of sounds like it to me. Even with a lot of AI training specifically for a given game, DLSS didn't look or perform nearly as well as simply upscaling with traditional methods and applying a good sharpening filter. DLSS 2.0 apparently doesn't require game-specific training, yet it is supposed to look and perform better than the original method. Since the updated sharpening filter that Nvidia recently implemented does just that while using traditional upscaling, it seems a bit convoluted to be using a method involving AI to accomplish the same thing. According to them, they are still apparently using the Tensor cores to perform the upscaling, but if game-specific training isn't involved and performance has significantly improved, it seems logical that they have "dumbed down" the upscaling process to something a lot simpler, and are relying on the improved sharpening filter to do the actual heavy lifting of making that upscaled content look decent.

    bit_user said:

    Why are you so quick to write off the potential for AI in upscaling, just because their first attempt wasn't great?

    I get the impression that the first-gen RTX cards simply don't have enough tensor cores to do the task adequately in real-time, or at least not any better than other methods. And sure, maybe they've improved it with this, but they could have just as easily improved it without the Tensor cores. It's possible that the next generation of graphics cards might have the Tensor performance to actually justify the use of AI in upscaling, but since it requires dedicated hardware, it needs to look substantially better than other upscaling methods at a given performance level to justify its existence.


  • bit_user

    Wow, that's quite a lot of opinion to base on so little information.

    I know you staked out a strong position against DLSS, but how can you be so sure about 2.0? You haven't even seen it!

    cryoburner said:

    It kind of sounds like it to me.

    What part of the text I quoted from Nvidia's website sounds like it to you?

    cryoburner said:

    Even with a lot of AI training specifically for a given game, DLSS didn't look or perform nearly as well as simply upscaling with traditional methods and applying a good sharpening filter.

    Deep Learning is complicated and still pretty new. I don't know where you found this confidence in people's ability to do something optimally, on the first try. I haven't seen any basis for it, in my time on this planet.

    cryoburner said:

    Since the updated sharpening filter that Nvidia recently implemented does just that while using traditional upscaling, it seems a bit convoluted to be using a method involving AI to accomplish the same thing.

    Presumably, they think it looks better than their sharpening filter. Indeed, a simple sharpening filter will always have limitations and artifacts, so it's not hard for me to believe a convolutional neural network can do better.

    cryoburner said:

    According to them, they are still apparently using the Tensor cores to perform the upscaling, but if game-specific training isn't involved and performance has significantly improved, it seems logical that they have "dumbed down" the upscaling process to something a lot simpler,

    Sometimes, you can find a simpler method that also works better. The same is true of deep learning - you can sometimes find an architecture and a way of using it that both improves accuracy and efficiency.

    It's not only the design of their network that could've changed, however. They also quite likely improved training and are now using a loss function which doesn't penalize high-frequencies so severely.

    cryoburner said:

    I get the impression that the first-gen RTX cards simply don't have enough tensor cores to do the task adequately in real-time, or at least not any better than other methods.

    Again, how do you know? You're clearly not a deep learning expert. Did you even ask one?

The 2080 Ti is capable of about 250 TOPS at 8-bit precision. That's a staggering amount of compute power: 67.8 million ops per pixel per second at the highest input resolution of 2560x1440, or 30.1 MOPS per output pixel at 4K.
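That back-of-the-envelope figure checks out. A quick sketch of the arithmetic (note these are ops per pixel per second, so the per-frame budget divides by the frame rate; the 250 TOPS figure is the commenter's, not an official spec):

```python
# Reproducing bit_user's arithmetic: 250 TOPS spread across every pixel.
TOPS = 250e12          # claimed 8-bit ops/second for a 2080 Ti
qhd = 2560 * 1440      # highest DLSS input resolution mentioned
uhd = 3840 * 2160      # 4K output resolution

print(TOPS / qhd / 1e6)  # ~67.8 MOPS per input pixel, per second
print(TOPS / uhd / 1e6)  # ~30.1 MOPS per 4K output pixel, per second

# The per-frame budget is what matters for real-time inference:
print(TOPS / qhd / 60 / 1e6)  # ~1.13 MOPS per input pixel at 60 fps
```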

    cryoburner said:

    And sure, maybe they've improved it with this, but they could have just as easily improved it without the Tensor cores. It's possible that the next generation of graphics cards might have the Tensor performance to actually justify the use of AI in upscaling, but since it requires dedicated hardware, it needs to look substantially better than other upscaling methods at a given performance level to justify its existence.

    You should wait and see it, before making such conclusions.

    Now that you've taken such a strong line against DLSS 2.0, I cannot trust your opinion of it, once it's in the wild and you actually have a chance to evaluate what you've preemptively judged.

    I have to say I'm disappointed. You're better than this.


  • cryoburner

    bit_user said:

    What part of the text I quoted from Nvidia's website sounds like it to you?

    They also refer to it as "super-sampling" right in its name, but that's the marketing team's way of describing what is actually the opposite of super-sampling.

    bit_user said:

    Presumably, they think it looks better than their sharpening filter. Indeed, a simple sharpening filter will always have limitations and artifacts, so it's not hard for me to believe a convolutional neural network can do better.

    We can be pretty sure that they are still using the sharpening filter to make the upscaled output look decent, only now they are using the new, more advanced sharpening filter, rather than the mediocre old one. That's likely the biggest change here. DLSS without a postprocess sharpening filter looked incredibly blurry, and sharpening was the only thing making some of the later implementations look half-decent. With the better sharpening filter, they can get away with lower-quality (or arguably no) AI-based upscaling, hence why game-specific training is no longer needed, and why performance has improved.

    bit_user said:

    You should wait and see it, before making such conclusions.

As the article states, this isn't exactly something that's brand new; they are just marketing it as "2.0" now. Wolfenstein: Youngblood and Deliver Us The Moon have already been using this implementation for a while.

    In any case, I haven't actually taken a strong stance against DLSS, or at least the new implementation. From what I've seen, it appears to work about on par with other decent upscaling and sharpening methods now, and it should be pretty straightforward to use. It is still questionable whether it's doing anything that actually "requires" Tensor cores though, seeing as other methods still achieve similar results. I get the impression that Nvidia is simply keeping that as a requirement in order to push RTX cards, even if the hardware offers minimal benefit to the finished output.


  • bit_user

    cryoburner said:

    They also refer to it as "super-sampling" right in its name, but that's the marketing team's way of describing what is actually the opposite of super-sampling.

    Yes, it's a fair point. I imagine they'd say they're using Deep Learning to infer what super-sampled output would look like, but you're right about that.

    cryoburner said:

    We can be pretty sure that they are still using the sharpening filter to make the upscaled output look decent,

    Why do you assume that a neural network can only produce soft output? I think the softness of DLSS was an artifact of the loss function they used for training it. There's nothing fundamental about neural networks that would tend to produce a soft output.

    cryoburner said:

    With the better sharpening filter, they can get away with lower-quality (or arguably no) AI-based upscaling, hence why game-specific training is no longer needed, and why performance has improved.

    Okay, you've gone beyond obstinate. I've tried my best to explain, but it's beginning to feel like a lost cause. If you want to believe it's just a conventional sharpening filter, be my guest.


  • cryoburner

    bit_user said:

    Okay, you've gone beyond obstinate. I've tried my best to explain, but it's beginning to feel like a lost cause. If you want to believe it's just a conventional sharpening filter, be my guest.

    Based on everything I've seen about DLSS, yes, I fully believe they are using postprocess sharpening on the upscaled output. Everything seems to indicate that, including what the final image looks like up close. Such sharpening routines have been shown to be a good way of making upscaled content look decent with a very low performance impact, looking and performing notably better than prior implementations of DLSS, so it only makes sense that they would utilize that for the update. They did implement their improved sharpening routine not too long before games started utilizing this updated DLSS, after all.

    There probably is AI processing still going on for the upscaling part of the process, though likely in a simplified form, and it may even handle certain things a little better than other raw, unsharpened upscaling techniques, but the bulk of the visual improvement here over the prior implementation is most likely coming from improvements to post-process sharpening.


  • bit_user

    cryoburner said:

    Based on everything I've seen about DLSS,

    Anandtech has a better article, including screen shots and some details about how it works.

    https://www.anandtech.com/show/15648/nvidia-intros-dlss-20-adds-motion-vectors


  • cryoburner

    bit_user said:

    Anandtech has a better article, including screen shots and some details about how it works.

    https://www.anandtech.com/show/15648/nvidia-intros-dlss-20-adds-motion-vectors

It's not a bad article, and it at least acknowledges that DLSS 1.x had significant problems and was outperformed by traditional upscaling and sharpening techniques. This Tom's article, by contrast, claims that "DLSS was a breakthrough" and makes it sound like it was already unmatched in its original form, without acknowledging the reasons Nvidia felt a reboot was necessary. It reads more like a marketing piece than anything trying to be informative, which was the main issue I had with it, and why my first post might come off as a bit negative.

    Still, I don't see anything in that AnandTech article going against the suggestion that sharpening is being performed on the upscaled output post-process. Yes, the Tensor cores are apparently still performing the actual upscaling, however, that's not necessarily the only stage of "DLSS", as it may be a multi-step process. First, there's the faster, simplified Tensor upscaling, likely followed by another step involving sharpening, which can be efficiently performed on traditional graphics hardware. To the developer, the process likely looks like a single step at the end of the rendering process, but that doesn't mean there are not separate sub-steps being performed on different hardware within it.


  • bit_user

    cryoburner said:

    Still, I don't see anything in that AnandTech article going against the suggestion that sharpening is being performed on the upscaled output post-process.

    No, it doesn't and I never said they're not doing it, either. My point was that there's not anything intrinsic about using a neural network that would lead to a blurry output, as you seem convinced must be the case. You're extrapolating from an example of 1.

    The details about it using motion vectors and having to be integrated at the source-level are no doubt also interesting.



FAQs

Does DLSS 2 improve performance? ›

Yes. This is increasingly important as 4K gaming monitors become more popular, because rendering PC games at that resolution takes a massive amount of computing power. DLSS lightens that load and improves performance.

Does DLSS 2.0 work at 1080p? ›

Having played these games extensively using DLSS at 1080p, our opinion remains largely unchanged from our initial experiences with upscaling at this resolution. DLSS is far less effective at 1080p compared to 1440p or 4K, because of how upscaling at this resolution uses a very low render resolution.

Should I enable DLSS? ›

You may not need to use DLSS in less demanding titles, even if they support the feature. There are definitely cases where DLSS can make a game look better. That's the idea behind DLAA in the first place — the AI-assisted anti-aliasing in DLSS looks fantastic.

Is DLSS 2 available for 30 series? ›

Yes. DLSS 2 Super Resolution is supported on GeForce RTX 30 Series cards. This lets you crank up the settings and resolution for an even better visual experience.

Does DLSS 2.0 work on all games? ›

Does NVIDIA® DLSS support all games? No, not all games currently support NVIDIA® DLSS. However, the list of supported games is continuously growing as more developers integrate DLSS into their titles.

Is DLSS 2 better than DLSS 3? ›

DLSS 2 boosted performance to 84 FPS. That's already excellent, but DLSS 3 put the game into overdrive, accelerating the average frame rate to an incredible 136 FPS. That's a 300 percent improvement over native rendering.

What are the downsides of DLSS? ›

If you're CPU bottlenecked, DLSS won't increase your framerate much, if at all, because it achieves that increased framerate by lowering the real render resolution. You'll still see an upscaled image, but without the extra frames.

Does DLSS affect CPU or GPU? ›

When you enable DLSS or FSR, the internal resolution is lowered and the image is upscaled. In theory this increases FPS, but it also shifts load toward the CPU: because the GPU no longer has to work as hard to pump out pixels, the CPU must work harder to keep up with the extra frames the GPU can now render, resulting in higher CPU usage.

Does DLSS affect visual quality? ›

With NVIDIA DLSS, gamers aren't tethered to native 4K hoping to achieve 50-60 fps. They can render at resolutions like 1080p or 1440p and let DLSS reconstruct the visual data. The outcome? Enhanced frame rates with little to no discernible loss in image quality.

What is DLSS 2 enhanced? ›

Augments DLSS 2.0 by making use of motion interpolation. The DLSS frame generation algorithm takes two rendered frames from the rendering pipeline and generates a new frame that smoothly transitions between them. So for every frame rendered, one additional frame is generated.
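The interleaving pattern described above can be sketched as follows. `interpolate` here is a hypothetical stand-in: the real frame-generation algorithm uses motion vectors and a learned model, not simple averaging, but the one-generated-frame-per-rendered-frame structure is the same.

```python
def interpolate(a: float, b: float) -> float:
    """Placeholder for the learned interpolation between two frames."""
    return (a + b) / 2

def with_generated_frames(rendered: list[float]) -> list[float]:
    """Insert one generated frame between each pair of rendered frames."""
    out = []
    for a, b in zip(rendered, rendered[1:]):
        out.append(a)
        out.append(interpolate(a, b))
    out.append(rendered[-1])
    return out

# Frames stand in as scalars; in practice they are full images.
print(with_generated_frames([0.0, 1.0, 2.0]))  # [0.0, 0.5, 1.0, 1.5, 2.0]
```

Note the sequence nearly doubles in length: three rendered frames become five displayed frames, which is where the headline FPS gains come from.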

How do I activate DLSS 2? ›

Instead, you must enable DLSS in each game through its settings menu.
  1. First, launch a game that supports DLSS.
  2. Open the Settings menu and go to the Graphics tab.
  3. Toggle DLSS to On. If you don't see DLSS, then you may first have to select an upscale method. ...
  4. Finally, play the game and enjoy the performance improvement.

What is the best graphics card in the world? ›

The best graphics card right now is Nvidia's GeForce RTX 4090 and there's nothing subtle about this ultimate gaming performance monster.

Does DLSS 2 increase input lag? ›

Only FSR 3 and DLSS 3 increase input lag. In general FSR 2 and DLSS 2 lower input lag because more real FPS = lower input lag (when comparing native resolution to the upscaled result). The exception being when you are heavily CPU bottlenecked (like using DLSS at 1080 with a high end GPU).

Does DLSS help CPU performance? ›

DLSS reduces the load on the GPU and therefore puts more of the onus on the CPU.

Which GPU can use DLSS 2? ›

You'll need a GPU from the GeForce RTX lineup in order to use any of the Nvidia DLSS features.
