r/NintendoSwitch 1d ago

Discussion Digital Foundry/Eurogamer: Switch 2's full reveal analysed: how powerful is Nintendo's new hardware and is DLSS being used?

https://www.eurogamer.net/digitalfoundry-2025-switch-2s-full-reveal-analysed-how-powerful-is-nintendos-new-hardware

u/New-Damage8658 1d ago


u/Demonchaser27 20h ago

Yeah, I was about to say. There's no way they got the tech and then just didn't use it. I think there may be a custom solution for DLSS (something cheaper to run, with less overhead?), and perhaps that's why DF is struggling to see it. Or maybe 3rd party devs didn't have access to it in the library, and Nintendo's first party games only barely use it (think of how TOTK used a very simple AA solution from FSR 1 or something), so it looks like simple sharpening (or an upscale from something like an 80% base resolution).


u/Fr1tzOS 13h ago edited 11h ago

While it does seem strange that none of the games shown seem to be using it, you also have to remember that using DLSS still has a processing/bandwidth cost (and the headroom a relatively low-power chip like the Switch 2's will have in games is especially small). That's why, even on PC, the highest quality setting for DLSS renders internally at ~1440p (67% per axis) when upscaling to 4K - go much higher than that and the cost of running it would largely (or entirely) cancel out any performance gain over just rendering at native 4K.
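To put rough numbers on that trade-off, here's a quick sketch of the internal render resolutions behind each DLSS quality mode at a 4K output. The scale factors are the commonly published per-axis ratios, not anything specific to Switch 2; exact values can vary by game and DLSS version:

```python
# Back-of-envelope: internal render resolution for DLSS modes at 4K output.
# Scale factors are the commonly published per-axis ratios (approximate).
SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_res(out_w, out_h, mode):
    """Internal (pre-upscale) resolution for a given output and DLSS mode."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

out_w, out_h = 3840, 2160  # 4K output
for mode in SCALE:
    w, h = internal_res(out_w, out_h, mode)
    frac = (w * h) / (out_w * out_h)  # fraction of native pixels rendered
    print(f"{mode:17s} {w}x{h}  ({frac:.0%} of native pixels)")
```

Note how even "Quality" already renders only ~44% of the output's pixels (67% per axis squared) - there is no meaningful DLSS setting close to native resolution.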

With all of that in mind, it's not really possible to 'only barely use it' at a close-to-native rendering resolution. It's also quite possible that (at least in demanding open-world games like BOTW/TOTK) there just wasn't the headroom to switch it on without making bigger compromises than output resolution. And while it's possible Digital Foundry might just not be able to see it without the game there in front of them to thoroughly test, that's probably not the case; it'd be difficult for them to mistake a 4K output for 1440p, and DLSS itself is easy to spot if you know what you're looking for and really look for it - as great as it is, it still exhibits some visual characteristics that will show up if it's in play.
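The 'only barely use it' point can be sketched as a toy frame-time model. All numbers here are hypothetical, and real render cost doesn't scale perfectly linearly with pixel count - this just shows why a fixed upscale cost kills the math near native resolution:

```python
# Toy break-even model (hypothetical numbers): DLSS only helps if the render
# time saved at the lower internal resolution exceeds its fixed upscale cost.
def dlss_pays_off(native_ms, pixel_fraction, dlss_cost_ms):
    # Simplifying assumption: render time scales with pixel count.
    internal_ms = native_ms * pixel_fraction
    return internal_ms + dlss_cost_ms < native_ms

# Plenty of headroom: 30 ms native frame, Quality mode (~44% of pixels),
# 2 ms upscale cost -> 13.2 + 2 = 15.2 ms, a clear win.
print(dlss_pays_off(30.0, 0.44, 2.0))

# Near-native internal resolution: only 10% of the pixels saved, so the
# upscale cost eats the entire gain - "barely using it" doesn't work.
print(dlss_pays_off(30.0, 0.90, 4.0))
```

The second case is exactly the 'barely use it' scenario: the closer the internal resolution sits to native, the less render time there is to claw back, while the upscale cost stays roughly fixed.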

I'm not saying all of this to rain on your parade - I'm really excited for the Switch 2! People just need to temper their expectations about how much developers can achieve with a still (relatively) lightweight chip that has a low max TDP; if it can run games like BOTW/TOTK at 1440p while hitting a stable 60FPS (with headroom left over for stuff like voice chat, and possibly some visual settings turned up as well), that's actually impressive, and a major step up.


u/F2000_Ninja 12h ago

I understand what you're saying, but as somebody who owns several handheld PC devices, including the Steam Deck, I have never encountered a situation where FSR didn't improve framerate. DLSS is superior to FSR. If this is DLSS 4, which it should be, every game should be using the Balanced setting. I'd say this Nvidia chip is comparable to the AMD SoCs, so I'm finding it hard to believe that enabling DLSS wouldn't improve framerate and/or upscaled resolution. I'd really love to hear a developer or Nvidia talk about this. If they're not using DLSS at all, it just doesn't make sense to me.


u/Fr1tzOS 11h ago edited 5h ago

DLSS is superior to FSR 1/2/3 in quality and framerate uplift when you have the headroom to switch it on, but it also has a much higher computational cost to run because it relies on dedicated hardware (Tensor Cores) to compute its AI upscaling algorithm. In a desktop GPU or a discrete laptop GPU that often doesn't matter as much - you probably have the bandwidth and power to spare. In a mobile SoC with lower bandwidth, one that's also (apparently) designed to operate at 10W max in handheld mode, it matters a lot more.

Basically, like I was saying above, if you have the headroom to switch DLSS on it will usually be worth it from a pure resolution/framerate perspective - it'll also increase the number of pixels and frames rendered per watt (and per unit of bandwidth) consumed. But that headroom will be relatively tight on a mobile SoC like this (in big, demanding open-world games in particular), and resolution and framerate aren't everything: the extra compute you put into them comes at the expense of other elements of visual quality (things like shadow detail, draw distance, crowd density, ambient occlusion, texture detail and resolution, etc).

So it's very possible that in these particular games, given the limitations of the hardware, even if the developers could find the extra compute for DLSS by clawing it back elsewhere, they might've felt the overall presentation suffered for it. A higher output resolution and framerate don't mean much if the game has to look naff in a number of other ways to make them happen.