Intel’s Making All The Right Moves With Their Alchemist Gaming GPU

Actually rendering games at 4K is for suckers, it seems. In a new interview, Intel has revealed more details about the feature set of its inaugural Alchemist graphics cards, including a version of AI upscaling that works similarly to Nvidia’s excellent deep learning super sampling (DLSS).

Let’s talk about their upscaling offering first. It’ll be called XeSS, and Intel are hoping to use it to boost frame rates in games by as much as a factor of two. A live version of the AI-powered tech was shown off during Intel’s Architecture Day in August using Unreal Engine 5, but only now do we have more details about how XeSS functions in comparison to Nvidia and AMD’s competing technologies.

DLSS has been one of the most impressive pieces of tech in gaming for years, which is why fans have pushed so hard for AMD to come up with an alternative solution. Intel’s read the tea leaves, too, which is why they’re following more closely in Nvidia’s footsteps while also pledging to make the technology open source.

But first, a quick primer. For those who haven’t been following, AMD and Nvidia have basically tackled the idea of upscaling in two different ways. Nvidia, which pioneered the technology, has been using neural networks and machine learning that’s powered by specialised cores on their RTX line of GPUs. The initial release of this, called DLSS (deep learning super sampling), had to be trained on a game-by-game basis, however. That was slow and inefficient for mass adoption, so Nvidia built a new implementation that doesn’t rely on game-specific content. Nowadays, DLSS provides a crisper, cleaner image — especially in motion — and some of the technology’s earlier restrictions, like limitations on what resolutions you could run, have been removed.
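
If you’re wondering where the frame rate gains come from, it’s simple pixel arithmetic: the GPU renders each frame at a lower internal resolution, then the upscaler reconstructs the output resolution. Here’s a rough sketch of the maths — the mode names and scale factors below are illustrative placeholders, not the exact figures any vendor uses:

```python
# Rough arithmetic behind upscaling: render fewer pixels internally,
# reconstruct the rest. The scale factors here are hypothetical
# examples, not DLSS, FSR or XeSS's actual settings.

TARGET = (3840, 2160)  # 4K output

# Hypothetical quality modes: fraction of the target resolution
# rendered internally on each axis.
MODES = {
    "quality": 0.67,
    "balanced": 0.58,
    "performance": 0.50,
}

def internal_resolution(target, scale):
    """Resolution the GPU actually renders before upscaling."""
    w, h = target
    return int(w * scale), int(h * scale)

def pixel_savings(scale):
    """Fraction of per-frame shading work saved versus native rendering."""
    return 1.0 - scale * scale

for mode, scale in MODES.items():
    w, h = internal_resolution(TARGET, scale)
    print(f"{mode}: renders {w}x{h}, ~{pixel_savings(scale):.0%} fewer pixels shaded")
```

At a 0.5 scale, the card shades a quarter of the pixels it would at native 4K, which is why “performance” modes can roughly double frame rates in GPU-bound games — the hard part, and where the neural network or algorithm earns its keep, is making the reconstructed image look like native.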

But DLSS still has a problem: you need an Nvidia GPU. AMD wanted a more open solution, so they built FidelityFX Super Resolution (FSR), an algorithm-based upscaling technology that works with any GPU. FSR doesn’t require specialised AI hardware or neural network training, which can be especially helpful for those with entry-level computers or gaming laptops that are limited to lower resolutions. And by not requiring specialised hardware, there’s a hope that FSR will one day make an appearance in the PlayStation 5 and Xbox Series X, given both consoles are built with the same RDNA 2 architecture that powers AMD’s desktop GPUs.

Of course, there’s a slight catch. FSR’s upscaling, despite not using a neural network, is pretty good. FSR can also deliver a bigger frame rate bump at its most aggressive settings, but image quality takes a much greater hit than DLSS does under the same circumstances. So which one is better really depends on your individual needs and what you can live with.

FSR’s big problem right now, though, is adoption. DLSS is rolling out in new games relatively quickly, thanks to its more generalised algorithm and the release of engine-level plugins for Unity and Unreal that simplify matters massively for developers. Intel’s response, according to XeSS principal engineer Karthik Vaidyanathan, is to make their technology open source while also leveraging their own specialised AI hardware, potentially getting the best of both AMD and Nvidia’s worlds.

“Most state-of-the-art game [techniques] use a lot of heuristics, a lot of hand-designed approaches, to try and use as many pixels as they can, but try to do it in a way where you don’t end up integrating false information or invalid information. But they don’t work all the time, and therefore you will often see artifacts like ghosting and blurring, issues commonly associated with techniques like TAA and checkerboard rendering. And that’s where neural networks come in, because this is almost an ideal problem for neural networks. They are very good at detecting complex features, and that’s where we can use them to integrate just the right amount of information and, when that information is not there, try to detect these complex features and reconstruct them. So that sort of summarises the technology.”

To make sure XeSS is compatible with other vendors, Intel has built their tech around an instruction set called dot product acceleration. That’s currently supported by AMD (since RDNA 2) and Nvidia (since their Pascal-based 10-series cards). And similar to DLSS and FSR’s current implementations, users will be able to pick from a set of resolution and quality settings. It also won’t require game-by-game training, which should make it easier not only for developers but also engine makers like Unity and Unreal.
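
If “dot product acceleration” sounds abstract, the operation itself is tiny: multiply four pairs of small (8-bit) integers and add the results into a single 32-bit accumulator, in one instruction. Neural-network inference, including the kind an upscaler runs, is dominated by exactly this operation. The sketch below is a pure-Python model of those semantics for illustration, not real GPU code:

```python
# A pure-Python model of a DP4a-style instruction: a 4-element
# int8 dot product with a 32-bit accumulate. GPUs execute this
# in one instruction; neural-network inference chains millions
# of them. Illustrative only, not actual shader code.

def dp4a(a: list[int], b: list[int], acc: int = 0) -> int:
    """Emulate a four-wide int8 dot product with accumulate."""
    assert len(a) == len(b) == 4
    for x, y in zip(a, b):
        assert -128 <= x <= 127 and -128 <= y <= 127, "int8 range"
        acc += x * y
    return acc

# One output value of a tiny neural-network layer is just many of
# these chained together (hypothetical weights and pixel values):
weights = [3, -1, 2, 5]
pixels = [10, 20, 30, 40]
print(dp4a(weights, pixels))  # 3*10 - 1*20 + 2*30 + 5*40 = 270
```

Because this instruction exists on recent AMD and Nvidia hardware too, an XeSS network quantised to 8-bit integers can run on competitors’ GPUs, just without the extra speed of Intel’s dedicated matrix cores.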

“XeSS from day one, our objective has to be a generalised technique,” Vaidyanathan said in an interview with WCCFTech. “It’s similar to the other system [DLSS 2.0], but the underlying technology is likely very different. Because when you have two independent groups, trying to solve a problem in their own way they will likely end up with very creative solutions to the problem.”

An SDK for developers that uses Intel’s Xe Matrix Extension cores — the AI-powered part of Intel’s upcoming Alchemist GPUs — will be available this month. If developers want to play around with the version of XeSS that relies on dot product acceleration, the one that works with hardware consumers actually own today, that’ll supposedly be out later this year.

Intel’s currently working with “several partners” to have XeSS implemented, which Vaidyanathan specified included “game developers”. It’s probably a similar set of partners to what AMD and Nvidia started with — engine makers like Unreal, companies like EA with their own massive internal engines like Frostbite, and so on. Intel’s also done a lot of work with Creative Assembly in the last few years — Total War has been used as a highlight for Intel’s mobile gaming chops, so I wouldn’t be surprised if we see Three Kingdoms or Total Warhammer 3 pop up in a future benchmark or tech demo.

Of course, none of this answers the question that people care about: whether Intel’s Alchemist gaming GPUs are actually worth buying. There’s growing speculation that, like AMD’s first attempts with their RDNA architecture, Intel will target the low to mid-range market when Alchemist launches next year. In real-world terms, that means a competitor to the RTX 3070, RTX 3060 Ti and the Radeon RX 6700 XT, cards that are all priced at or above $1000 in Australia right now. You’d have to imagine Intel would need to price very aggressively to convince any buyers to hop on board. But if the performance and feature set gets close enough, and developers are happy with where XeSS is at, 2022 could be a very interesting year for the gaming landscape.
