AMD’s Answer To Nvidia’s DLSS, FSR, Works On All GPUs

Image: AMD (YouTube)

Nvidia’s DLSS isn’t foolproof, but when it works, the performance benefits of AI-powered upscaling can be astonishing. It’s why fans have been pushing AMD for almost a full year to come up with an answer of its own, and today, AMD finally announced one. Even better: it’ll work on any GPU hardware.

Termed FidelityFX Super Resolution (FSR), the technology is basically AMD’s response to Nvidia’s deep learning super sampling (DLSS). But much like DLSS when it was first unveiled, AMD’s technology won’t be broadly supported to begin with. Only 10 games and game engines will support AMD’s FSR this year, although AMD’s hope is that its open-source approach will lead to faster adoption in the long term.

And AMD might be right, especially since AMD’s FSR doesn’t just work on AMD hardware: it supports Nvidia cards too. In an embargoed briefing, AMD showed a performance slide from Godfall where it claimed a GeForce GTX 1060 (the most popular GPU according to the latest Steam hardware survey) went from 27 FPS to 38 FPS while running at 1440p on Epic settings.

Image: AMD

It’s a bit of a weird test, and AMD’s footnotes added that the testing machine used a Ryzen 9 5950X CPU and the October 2020 Windows 10 update, rather than the May 2020 update AMD used for another benchmark slide comparing FSR’s performance in different modes. You wouldn’t play a fast-paced melee game like Godfall at less than 60 FPS anyway, even on console.

Journalists asked during the briefing whether FSR had any resolution restrictions, as Nvidia’s DLSS technology did in its initial implementation. AMD said they would provide more detail on June 22, or June 23 Australian time, which is when FSR is due to launch.

That’s also when we’ll get the full list of games and engines that support FSR out of the gate. If Godfall is any indication, there’s a lot to look forward to. The very good Radeon RX 6800 XT, under AMD’s testing, only runs at 49 FPS in Godfall when ray tracing and all settings are maxed out.

But with FSR enabled, that frame rate jumps to 78 FPS at the lowest upscaling setting (Quality) and 150 FPS at best (when Performance is enabled).

Image: AMD

AMD was pressed on the rendering resolution for each of the settings. Nvidia’s DLSS in Performance mode outputs roughly 4x the pixels of its internal render resolution, so a 4K game would be using AI to upscale the image from 1080p. (Some titles, like Cyberpunk 2077, have an Ultra Performance mode available, which upscales by a factor of 9x. It’s not really worth it though: the loss in image quality is big enough that you’re better off taking the frame rate hit and playing at native 4K.) AMD, however, only added that more detail would be announced on June 22 (June 23 Australian time).
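
For a rough sense of that arithmetic, here’s a minimal sketch of how an output resolution maps back to an internal render resolution for a given per-axis scale factor. The mode names and ratios below are the commonly cited DLSS ones, used purely for illustration; AMD hadn’t confirmed FSR’s equivalents at the time of the briefing.

```python
# Illustrative only: the ratios below are the commonly cited DLSS ones,
# not confirmed FSR figures.
OUTPUT_4K = (3840, 2160)

MODES = {
    "Quality": 1 / 1.5,            # ~2.25x fewer pixels
    "Performance": 1 / 2.0,        # 4x fewer pixels (4K upscaled from 1080p)
    "Ultra Performance": 1 / 3.0,  # 9x fewer pixels (4K upscaled from 720p)
}

def internal_resolution(output, per_axis_scale):
    """Return the internal render resolution for a given per-axis scale."""
    width, height = output
    return round(width * per_axis_scale), round(height * per_axis_scale)

for mode, scale in MODES.items():
    w, h = internal_resolution(OUTPUT_4K, scale)
    pixel_factor = (OUTPUT_4K[0] * OUTPUT_4K[1]) / (w * h)
    print(f"{mode}: {w}x{h} internally, ~{pixel_factor:.2f}x pixel upscale")
```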

The company also used its virtual Computex keynote to unveil the next generation of Radeon mobile hardware: the RX 6800M, 6700M and 6600M. The top of the stack, the RX 6800M, ships with 40 compute units, 12GB of GDDR6 memory, 96MB of Infinity Cache and a boost/Game Clock speed of 2300MHz, which is pretty fast for something stuck inside the compressed chassis of a laptop.

Image: AMD

It’s being pitched as a high-powered 1440p card, with AMD specifically calling out frame rates above 120 FPS. Most of the titles mentioned were either esports-focused games or older AAA releases like Battlefield V, but AMD did showcase another slide claiming the 6800M was either comparable to, or just ahead of, Nvidia’s mobile RTX 3070 and RTX 3080 cards.

Image: AMD

The newer AMD laptop, according to the appendix, was a Ryzen 9 5900HX ASUS ROG Strix, while the Nvidia-powered laptops were the ASUS ROG Strix Scar GS333QS and the ROG Strix G513QR.

Laptops with the new AMD mobile hardware will start shipping worldwide from this month, with models from Lenovo, MSI, HP’s OMEN brand and ASUS ROG, although Australian availability is usually a little behind. All the new laptops will also support the same Smart Access Memory feature that’s enabled on Ryzen desktop CPUs, although as we’ve seen there, its benefits can vary greatly from game to game.

Still, it’ll be interesting to see just how difficult FSR is to implement when AMD reveals more details in a few weeks. We’ve seen DLSS uptake accelerate since Nvidia released an Unreal Engine plug-in to make it easier on developers. If AMD can integrate its tech seamlessly enough into major engines like Unreal, Unity and some of the bespoke offerings like Frostbite, AMD’s upscaling might start to make a difference much sooner than many expected.

Comments

  • The 1060 comparison image is not kind, and reminds me of DLSS 1.0 in some ways…

    This being said, it’s important to note that AMD’s solution isn’t AI and isn’t temporal, it’s spatial. Upside: there’s no frame delay, so image persistence should be excellent. Downside: spatial upsampling usually doesn’t accumulate frame data (unless the algorithm is adaptive), so image quality is likely to be lower. You can see the effect of this in DLSS when you refresh the entire image (by spinning): when you stop, there’s a momentary ‘snap’ to full quality as DLSS accumulates previous frame data. That being said, there’s nothing stopping you from applying TAA on top of spatial upsampling to get a similar effect (a rough sketch of the difference is below).

    FSR being available everywhere is likely the biggest thing that will kill DLSS off, however; there’s no need to spend time integrating one technology for one platform when you can build and ship once.
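
    As a toy illustration of the spatial-versus-temporal point above, here’s a hedged sketch: it is not how FSR or DLSS are actually implemented, just the general shape of “one frame only” versus “blend in history”.

    ```python
    import numpy as np

    def spatial_upscale(frame, factor=2):
        """Nearest-neighbour upscale of a single low-res frame; no history needed."""
        return frame.repeat(factor, axis=0).repeat(factor, axis=1)

    def temporal_accumulate(current, history, blend=0.9):
        """Blend the current frame with accumulated history.
        Real temporal upscalers reproject history with motion vectors first;
        that step is omitted here."""
        if history is None:
            return current  # first frame after a cut: no history yet, lowest quality
        return blend * history + (1.0 - blend) * current

    # Feed a stream of low-res frames through both paths.
    frames = [np.random.rand(180, 320, 3) for _ in range(4)]
    history = None
    for frame in frames:
        spatial_only = spatial_upscale(frame)                 # quality fixed per frame
        history = temporal_accumulate(spatial_only, history)  # quality builds over frames
    ```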

    • If it turns out to be an inferior solution, it’ll be a damn shame if it kills off DLSS just because it’s simpler to implement the ‘one size fits all’ fix. But that’s unfortunately how this shit goes sometimes, where the winner isn’t necessarily the best offering.

      Hopefully we’ll get to see comparisons in a game that supports both of them at some point, so we can actually see it put up against DLSS.

      • In the wider scheme of things, DLSS isn’t good enough or available widely enough to really mourn its demise, even if it produces something slightly better. Besides, if AMD fails, Epic just demoed its own upscaling tech in UE, which looks pretty good, and I daresay in-house devs will just license someone’s platform-agnostic solution that doesn’t suck if they have to.

  • Only one question: Why Godfall?

    That game was panned by critics and players alike as one of the worst PlayStation exclusives in years: 59% on Metacritic, and it was compared to Anthem for its failure to deliver a working or substantial game.

    • You work with the partners you’ve got. The Godfall team were working with AMD on the first implementation of raytracing for AMD cards, so it’s natural they’d continue working on future tech like this too. It’s not really about the game, more the developers and the willingness behind it.

    • AMD Sponsored Title that is available on all platforms running on Unreal Engine 4 (which was likely the primary development target for FSR due to its ubiquity in the industry and wide cross platform support).

  • Interesting to see what the impact in the console space is, considering neither console has the tech to run DLSS. If it can improve performance on those boxes, I’m sure it will make quite a few people happy.

  • I imagine this is how the Switch will achieve 4K

    But I’m also a tad confused: how is this FidelityFX Super Resolution different from the FidelityFX in things like Cyberpunk, Mechwarrior 5 and Shadow Of The Tomb Raider, where you lower the resolution, post-processing tries to make sense of the limited pixels, and then it sharpens the image?

    • That’s FidelityFX CAS, which is a contrast-adaptive sharpening algorithm. It’s not an upscaling technique, but post-processing sharpening applied over the top of the existing rendered image.

      It’s basically meant to counter the blurriness you get from temporal anti-aliasing; even though it has some scaling, it’s not designed to take images from 1080p to 4K, for example.
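
      For what it’s worth, here’s a heavily simplified sketch of the contrast-adaptive idea (an unsharp-mask variant that backs off where local contrast is already high). It’s only an illustration of the concept, not AMD’s actual CAS shader.

      ```python
      import numpy as np

      def contrast_adaptive_sharpen(img, strength=0.5):
          """img: HxW greyscale array with values in [0, 1]."""
          out = img.copy()
          for y in range(1, img.shape[0] - 1):
              for x in range(1, img.shape[1] - 1):
                  window = img[y - 1:y + 2, x - 1:x + 2]      # 3x3 neighbourhood
                  local_contrast = window.max() - window.min()
                  weight = strength * (1.0 - local_contrast)  # sharpen less on strong edges
                  # unsharp-mask style: push the pixel away from the neighbourhood mean
                  out[y, x] = np.clip(img[y, x] + weight * (img[y, x] - window.mean()), 0.0, 1.0)
          return out
      ```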

  • Like it or not, AMD’s solution will have to compete with DLSS at its best right now, and at its best DLSS will give a higher frame rate with better image sharpness. The example images provided by AMD don’t fill me with confidence when even the “Quality” processed image is noticeably less sharp than the original, even if the frame rate is higher.

    Still, if they can get it right enough, a GPU-agnostic solution will probably win out, like FreeSync beat out G-Sync.

    • If you think about the most used libraries that games have used over the years, they generally fall in three camps.

      1. Most widely implemented for targets across multiple platforms (think SpeedTree, Havok, and some aspects of GameWorks).
      2. Available by default (think UE extensions).
      3. Paid to implement.

      Faced with Epic’s UE inbuilt solution coming soon, and FSR being agnostic, I can’t really see DLSS gaining much more traction than it already has before dying out in a year or two.
