Here we are again, with AMD threatening to disrupt the PC hardware market. Their first-gen Ryzen launch certainly shook up the CPU market, but a few years on, the industry is still waiting for AMD to do the same with their graphics cards.
With the release of the 7nm Radeon 5700 and 5700 XT GPUs, we're starting to see a window into a future where AMD might genuinely compete with Nvidia.
The Radeon 5700 and 5700 XT aren't the first 7nm GPUs AMD has released on the market — the Radeon VII earlier this year was targeted at VFX artists, video editors and other content creators in need of 16GB of memory, while the Radeon Instinct MI60 Compute GPU was designed for researchers, scientists and academic workloads.
So for a lot of people, the Radeon 5700 and 5700 XT are really the first 7nm graphics cards worth paying attention to. And they're interesting cards, although far from the world beaters some hoped they might be.
So they're finally out: AMD's answer to the latest generation of hardware from Intel and Nvidia. But how much will they cost?
So far, manufacturers have maintained the MSRP set by AMD for the Radeon 5700 line. The 5700 XT I tested is available for $629 from PLE Computers and PC Case Gear, while other models from Sapphire, MSI, ASUS, Gigabyte and other brands retail anywhere from $639 to $699.
That's compared to the new RTX Super cards, which are going for way more. Prices for the RTX 2070 Super start from $859, matching the MSRP set by Nvidia, but many local retailers are selling cards for $900 and more, with some RTX 2070 boards pushing closer to $1000 and even beyond.
The RTX 2060 Super is more expensive than the Radeon 5700 and 5700 XT as well, with the cheapest models selling for $689 and many major Australian retailers selling brand-name AIB models from $729 up to $849. And that's really an important factor in how to frame and think about the Radeon 5700 line, especially once we get into the numbers.
Radeon 5700 XT Specifications
Here's the card's raw numbers:
|AMD Radeon 5700 XT 7nm GPU||
|---|---|
|Compute units|40 CUs|
A small note about the boost clock here. AMD has changed the naming conventions around how clock speeds are understood. The boost clock used to be what a card would ordinarily hit at full load during, say, a video game. GPUs have always had higher peak clock speeds than that, but you don't really count something you might only hit for a second or two, if that.
What AMD has done is introduce Game Clock, which is the "minimum expected GPU clock when running gaming applications" and basically the equivalent to what everyone used to just call Boost Clock. So if you're looking to compare speeds between the cards, consider Nvidia's boost clock (or, what everyone had previously accepted as boost clocks up until now) the same as "Game Clock".
On the power side of things, it's worth calling out some extra features designed to reduce power draw. Chill Mode is an option in the Radeon Settings that keeps a game's frame rate within a predetermined range. Since the release of the latest AMD software suite earlier this month, Radeon Chill's range defaults to between one-half of the monitor's maximum refresh rate and the maximum refresh rate (so on a 144Hz screen, the Chill range would be 72 FPS to 144 FPS). The feature isn't turned on by default, but it's a nice power-saving consideration, even if most gamers never think about it.
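For clarity, the default Chill range described above boils down to a trivial calculation. This is just a sketch of the logic, not AMD's actual implementation:

```python
def chill_range(max_refresh_hz: int) -> tuple:
    """Default Radeon Chill FPS range: half the monitor's maximum
    refresh rate up to the maximum refresh rate."""
    return (max_refresh_hz // 2, max_refresh_hz)

# A 144Hz monitor defaults to a 72-144 FPS range;
# a 60Hz monitor would default to 30-60 FPS.
print(chill_range(144))  # (72, 144)
```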
The drivers shipping with the new Navi cards also come with Radeon Anti-Lag (RAL). RAL isn't exclusive to the Radeon 5700 or 5700 XT — it's compatible with older GCN-based AMD cards and even their APUs. There's a slight caveat in that only the RX 5700 cards can use RAL in DirectX 9 games, and the feature doesn't work at all with DirectX 12 or Vulkan titles. AMD also recommended that reviewers run tests in exclusive fullscreen mode, rather than borderless fullscreen/borderless windowed.
Another feature worth mentioning is Radeon Image Sharpening, AMD's post-processing sharpening filter. It's not powered by machine learning the same way Nvidia's DLSS is — the 5700 cards don't have any equivalent features to power machine learning anyway — but as it's just a post-processing filter applied over the screen, it's usable across any game or any resolution. Well, in theory.
The kicker with RIS is that it's only usable in DX12, DX9 and Vulkan games, although the company has told multiple outlets that support for DX11 will be added in a future software update if there's significant demand. Given that there are more DX11 games, and that older DX11 titles are the ones people are most likely to run at higher or downsampled resolutions (because they're older and not as hardware intensive as, say, Assassin's Creed Odyssey at 4K), hopefully this is something AMD patches in the coming months. RIS is also a global setting: it's either on for all games or off for all of them, and you can't toggle it mid-game without restarting.
But because of time pressures — something much of the tech community has just gone through, with three new Nvidia cards, the third-generation Ryzen CPUs and two new 7nm GPUs on top of that, plus all the other news that interrupts the daily flow, like the Switch Lite and a refreshed Switch — I haven't had time to fully test all of the 5700 XT's features.
Radeon 5700 XT Benchmarks
The Radeon 5700 XT is a play by AMD for the mid-range market, people who would consider spending around $600 on a GPU but would completely baulk at $1000 or more. With that in mind, it should be no surprise that the 5700 XT excels at 1080p and 1440p resolutions, and isn't really suitable at 4K without some serious sacrifices.
There are outliers, of course. The superbly optimised Forza Horizon 4 is one case where the 5700 XT could cope quite well at 4K, and if you don't mind slightly older games I'd expect 2016's DOOM to run like a dream.
I also ran some tests against the current setup with the RTX 2080 Ti, Nvidia's current gaming flagship and still the best option for anyone looking to game in 4K. It's not a direct like for like comparison, but we don't have access to an RTX 2060 or the recently-released RTX 2060 Super which the 5700 XT is better compared against, so keep that in mind. All tests were also run several times, with the exception of Forza Horizon 4 which I ran and re-ran over multiple days just to be sure.
- CPU: Intel i7-7900X (stock speeds)
- RAM: 32GB DDR4 3200MHz G-Skill TridentZ RGB RAM
- GPUs: Radeon 5700 XT / RTX 2080 Ti Founders Edition (stock speeds)
- Motherboard: Gigabyte AORUS Gaming 7
- Monitors: Acer X27 4K HDR 144Hz / EIZO 23.5" 240Hz VA monitors
- PSU: EVGA Supernova G2 850W
- Drivers: Nvidia 431.36 / Radeon Adrenaline 19.7.2
For clarity: the 7900X is running on stock clock speeds on a Corsair H100i liquid cooler, while the RAM is running at 14-14-14-34-1.35V (confirmed with CPU-Z). G-SYNC was disabled for all tests, and the GPU was set to maximum performance in both the Nvidia Control Panel and the Radeon Settings.
For testing, the 2080 Ti was tested using Nvidia's 431.36 drivers from July 9. Tests were initially run with the 19.7.1 Radeon Adrenaline drivers released on July 7, but I re-ran all the tests following the release of the 19.7.2 drivers this week. Dynamic optimisation was also disabled in games where it's an option (Shadow of War and Forza Horizon 4), as was V-Sync. G-SYNC was also disabled for all the RTX 2080 Ti tests, and none of the Radeon Anti-Lag/Image Sharpening/ReLive features were enabled for the 5700 XT tests.
So let's get the real obvious result out of the way.
Everyone knows that Forza Horizon 4 is well optimised, but I don't think people realised until the last couple of weeks just how well optimised it is on AMD hardware. Some of that was a given — FH4 was optimised specifically for consoles and the GCN architecture within the Xbox One, so it should logically run even better with the newer, more efficient RDNA architecture underpinning the 5700 cards.
Watching the tests closely, you can see it in action: the AMD card is utilised more efficiently throughout, while the RTX 2080 Ti's usage drops accordingly. It's not that the RTX 2080 Ti isn't as powerful; if anything, it's gotten a little better over time, going off the initial testing I did last year when the RTX cards first launched.

This behaviour has been documented elsewhere, too. But because everyone's testing suite and choice of games is different, and because all of these cards launched in such a short space of time, it probably hasn't been as widely publicised.
But this is an exceedingly rare win for the 5700 XT, and by rare I mean the literal best case scenario. Outside of extremely well optimised games, the 5700 XT isn't really a reliable 4K card and performance falls back to more normal levels when you move towards DX11 games.
But hey! If you're only spending $629, or a bit more for a model with a better cooler (and you absolutely should), the results hold up. The 1440p figures aren't hovering around the 100 FPS mark as much as I would like, but it's hard to do much better for the price you're paying. The older non-Super Nvidia cards still cost a fortune: the non-Super RTX 2070 is still selling for $729 or more at most local PC stores, while the faster RTX 2070 Super is going for around $900 (although some places have Galax-branded models available for around $870).
The main question to answer here, beyond the performance you're getting and at what price, is whether the 5700 XT delivers as a 1080p and 1440p card. It does, but if you were hoping for a solid amount of overhead at max settings at 1440p, you might want to drop settings down from the highest preset just to be safe.
The User Experience
Part of buying an expensive graphics card is the user experience, which is to say, it should be seamless. The 5700 XT doesn't cost that much compared to the RTX 2080 or 2080 Ti, but it's still a huge outlay for people building their first PC, or when compared to the cost of a lot of normal components.
So it's normal for people to expect stuff to just, well, work. Similarly, it's also quite normal to expect a brand new AMD product to be a bit buggy out of the gate... and the 5700 XT certainly suffers from that.
Overclocking is pretty much broken as it stands. I encountered so many errors with the Radeon Wattman software that, even using the automated presets for overclocking, the only option in the settings that consistently worked was the option to boost the board's power up to 50 percent. Even then, I'd run into occasional crashes during 3D Mark tests or during game benchmarks, which doesn't really inspire a lot of confidence given the GPU is designed largely for both those things.
It's an issue that's been widely reported, and AMD are investigating. I'd add that there were similar quirks with Wattman when AMD first introduced it with the RX 400 series cards a few years back, and their first-generation Ryzen CPUs also had odds and ends that took months of updates and BIOS flashes to fully sort out. (Take their latest CPUs: the Windows 1903 update is a pain in the arse and has caused problems for all sorts of users, but it also includes a key update to the Windows scheduler that's particularly pertinent to the performance of the new Ryzen systems. So you basically have to cross your fingers and hope everything works.)
And then there's the card and board itself.
What you're looking at here is the default fan curve for the 5700 XT when you first install the card. The bottom axis is the temperature of the card, and the curve dictates how fast the GPU's fan should run at different temperatures. By default, the fan only runs at 40 percent of its maximum speed... if the GPU is running at 100 degrees.
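A fan curve like this is just a set of temperature/speed points with interpolation between them. Here's a rough sketch of that logic; the 40-percent-at-100-degrees point comes from the stock profile above, while the other curve points are hypothetical, not AMD's actual defaults:

```python
# Curve points as (temperature in °C, fan speed in %).
# Only the (100, 40) point reflects the stock 5700 XT profile discussed
# above; the lower points are made-up placeholders for illustration.
CURVE = [(40, 20), (70, 30), (100, 40)]

def fan_speed(temp_c: float) -> float:
    """Linearly interpolate fan speed between curve points,
    clamping below the first point and above the last."""
    if temp_c <= CURVE[0][0]:
        return float(CURVE[0][1])
    for (t0, s0), (t1, s1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
    return float(CURVE[-1][1])

print(fan_speed(100))  # 40.0 — the fan only hits 40% at 100°C
```

The point of the example is the shape of the problem: with a curve this shallow, the card has to run scorching hot before the fan does any real work, which is exactly the complaint with the reference profile.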
Which, obviously, is insane. Of course, you can't really run the fans much higher than that, because they are freaking loud. I'm talking unbearably noisy, loud to the point where I could hear the fans from one end of my apartment to the other.
The card itself will operate just fine on the default fan profile, but a better cooler setup would have allowed the fans to be pushed harder — meaning lower temperatures, and more performance for you.
Fortunately, that's a problem third-party boards (read: boards sold by MSI, ASUS, Gigabyte and so on instead of the "Founder's Edition" or reference models like this one from AMD) often solve. And if you remember one thing from this story, make sure it's this: don't buy a reference board. Wait for models with aftermarket coolers.
AMD promised the future of GPUs at their Computex launch this year, but the 5700 and 5700 XT aren't quite there yet. Neither card is a true gaming flagship in the way the market currently understands the term. The PC industry often talks about being on the cutting edge of graphics tech, pushing gaming and computers further and further. But here we are with two mid-level cards that cost more than the current-gen consoles. Neither is touted as a viable option for playing at 4K, and we're only a year away from gaming hardware that's capable of pushing out 8K content.
The messaging is a little strange, when you look at the wider picture.
But that aside, along with the inevitable annoyance of dealing with a buggy launch, the 5700 XT is a really interesting card. It's a genuine competitor at a price point that Nvidia can't really match right now, not unless they want to take a substantial hit to their margins. It'll be especially interesting when more AIB offerings land in Australia. All the options on the market now use the same blower design as the reference 5700 XT, which wasn't great when Nvidia used it on their Founders Edition cards, and it's no better here.
And if you wait a bit longer, you'll get the benefit of more stable, more optimised drivers.
I've put up this picture of the old AMD RX 480 because the 5700 XT reminds me a little of what people were hoping for when those cards launched. There was a lot of talk about superior optimisations in DX12 games, support for Vulkan, more efficiency in certain scenarios.
And to be sure, there was plenty of that. But the RX series remained a very entry-level, value for money offer. If you wanted to push your performance, you were still driven towards the Nvidia product stack, particularly in the Australian market where supply of AMD cards was very thin.
That might still be the case in the next few weeks, as retailers try to fulfil their pre-orders. But the 5700 XT shows a little more of that promise that was left unfulfilled. It's not just at a genuinely competitive price; it's an attractive purchase for anyone building a sub-$1500 or $2000 rig today. I'd still argue (again) that you shouldn't buy the reference card, however.
But the board itself is fine. And it makes me wonder about two things. As we move towards an 8K future and a second round of consoles built on AMD hardware — and games that are increasingly optimised for that hardware — what will that truly mean for the Navi line in 2020, 2021 and 2022?
And most of all, this is still a mid-range card.
What would a genuine AMD GPU rival — a refined, full-size Navi chip, one with ray-tracing support, AI, ramped up video encoding and better thermal design, plus the other features that you'd expect from a top-of-the-line flagship — actually look like?
My guess? We'll find out next year. For most people, the 5700 XT will do just nicely.