After kicking the Vega can down the road at Computex earlier this year, AMD finally unveiled their Vega series of GPUs at the SIGGRAPH event in Los Angeles. Three cards were unveiled: the Radeon RX Vega 56, an air cooled GPU available for $US399, as well as air cooled and water cooled iterations of the Radeon RX Vega 64.
The Vega GPUs have always been targeted at users looking to purchase a GTX 1070 or GTX 1080, and the released specifications for the Vega 56 and Vega 64 line up with that. The performance target comes at a heavy cost, though: the lowest power draw of the Vega GPUs is 210W, while the liquid cooled Vega 64 draws up to 345W on its own, more than the R9 390X or the R9 290X (both of which used around 250W-290W of power in third-party real-world testing). The R9 Fury X also drew around 275W under regular circumstances, and was manufactured on the older, larger 28nm process.
The liquid cooled Vega 64 comes with a supplied pump and 120mm radiator, so you won’t have to fork out for extra kit. It’s worth noting that the baseline Vega card is the air cooled model, however, featuring a blower-style setup not too dissimilar from NVIDIA’s Founders Edition cards.
The Radeon RX Vega 64 starts from $US499. The cards will go on sale internationally from August 15 local time, and we’ll confirm local pricing and availability as soon as we can.
On raw hardware alone, here’s how a GTX 1080 and GTX 1080 Ti compare to the Vega 64. Prices for the GTX cards are taken from the cheapest available on StaticICE right now, although your mileage may vary depending on your preferred retailer.
| | GeForce GTX 1080 | GeForce GTX 1080 Ti | RX Vega 64 (Liquid Cooled) | RX Vega 64 (Air Cooled) |
|---|---|---|---|---|
| TFLOPs | 9 | 11.3 | 13.7 | 12.66 |
| CUDA/Stream Processors | 2560 (CUDA) | 3584 (CUDA) | 4096 (SP) | 4096 (SP) |
| Core GPU Clock Speed | 1607MHz | 1481MHz | 1406MHz | 1247MHz |
| Boost GPU Clock Speed | 1733MHz | 1582MHz | 1677MHz | 1546MHz |
| Memory Speed | 10Gbps | 11Gbps | 1.89Gbps | 1.89Gbps |
| Memory Bandwidth | 320 GB/s | 484 GB/s | 484 GB/s | 484 GB/s |
| RAM | 8GB GDDR5X | 11GB GDDR5X | 8GB HBM2 | 8GB HBM2 |
| Power (W) | 180W | 250W | 345W | 295W |
| Price | From $799 | From $1099 | TBA | TBA |
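The TFLOPs row can be sanity-checked from the other rows: peak FP32 throughput is two operations per shader per clock (a fused multiply-add) at the boost clock. A quick sketch, assuming both Vega 64 variants use the full 4096-shader chip:

```python
# Back-of-envelope check of the TFLOPs column:
# peak FP32 = 2 FLOPs (fused multiply-add) x shader count x boost clock.
cards = {
    "GTX 1080":            (2560, 1733e6),
    "GTX 1080 Ti":         (3584, 1582e6),
    "RX Vega 64 (Liquid)": (4096, 1677e6),
    "RX Vega 64 (Air)":    (4096, 1546e6),
}

def tflops(shaders, boost_hz):
    """Peak single-precision TFLOPs from shader count and boost clock."""
    return 2 * shaders * boost_hz / 1e12

for name, (shaders, boost) in cards.items():
    print(f"{name}: {tflops(shaders, boost):.2f} TFLOPs")
```

The results land on the table's figures (8.87, 11.34, 13.74 and 12.66 respectively), which is why the air cooled Vega 64 trails the liquid cooled model despite identical silicon: it boosts lower.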
The Vega cards should be in similar territory to their NVIDIA brethren, at least in theory, although it’s worth remembering that the GTX 1080 has been out for a full year and AIB versions of the GTX 1080 Ti started landing in the hands of press around late March-early April.
Third party benchmarks haven’t hit the internet yet, and aren’t expected to until closer to launch. Initial indications place the Vega 64 GPU around the same territory as the GTX 1080, according to 3DMark listings spotted by Videocardz, although the core/boost clock speeds indicate that the Vega cards may not have been running at full performance.
AMD is making another big push with their Vega cards by bundling them in three separate Radeon packs, termed Radeon Red (with the Vega 56), Radeon Black (the air cooled Vega 64) and Radeon Aqua (the water cooled Vega 64). “Radeon Packs include a $US200 discount on the 34″ Samsung CF791 curved ultrawide FreeSync monitor, and a $US100 discount on select Ryzen™ 7 1800X processor and X370 motherboard combos – $US300 in combined hardware savings,” AMD said in a release. Each Radeon pack in Australia will also come with codes for Prey and Wolfenstein 2: The New Colossus, while those in Switzerland, Germany and Austria will get Sniper Elite 4 instead of Wolfenstein 2.
Comments
27 responses to “AMD Unveils RX Vega, Starts From $US399”
I know looks don’t really matter much, but man…I really don’t like that shield design.
Also, holy crap it’s power hungry. Even the 1080 Ti’s 250W seemed high, but 295W is crazy for a single card.
I love the design, so there you go 🙂
I generally like minimalist design, but not in this case. It reminds me of the Futurama episode where Leela got them a ‘safer’ ship that looked like a featureless minivan because it was safer. Give me the Professor’s hotrod ship any day!
Sweated for a bit seeing as I got a 1080ti last week
The 1080Ti is an amazing performer, I wouldn’t worry at all. Its performance jump over the base 1080 is unprecedented.
Certainly a bit disappointing but they could still be good depending on how this translates into AUD.
I think the RX Vega 56 at $399 is the better one, though. I’ll make a few arguments why below:
$399 US right now is $500 AUD, so $550 after tax, and maybe $599 for a high-quality aftermarket version? GTX 1070s were around $600 before the mining boom, and this Vega should theoretically be ~10-15% better (going by the RX Vega 64 being about equal to a GTX 1080, the RX Vega 56 having a bit over 10% fewer shaders, so say it’s 15% worse than a GTX 1080, and a GTX 1080 being around 25-30% better than a 1070).
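The back-of-envelope maths above can be sketched as follows (the exchange rate and tax figures are the comment's assumptions, not official pricing):

```python
# Rough sketch of the price reasoning in the comment above.
# All figures are the comment's assumptions, not official numbers.
USD_TO_AUD = 1.25   # exchange rate assumed in the comment ($399 USD -> ~$500 AUD)
GST = 1.10          # Australian 10% goods and services tax

def local_price(usd_msrp, exchange=USD_TO_AUD, tax=GST):
    """Estimate an Australian price from a US MSRP, before retailer margin."""
    return usd_msrp * exchange * tax

print(f"Vega 56 estimate: ${local_price(399):.0f} AUD")
print(f"Vega 64 estimate: ${local_price(499):.0f} AUD")
```

That puts the Vega 56 around $549 AUD before any aftermarket premium, consistent with the $599 guess above.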
Plus there’s driver improvements: Polaris got like 8-10% better in about 6 months, and I think with the architecture changes Vega has, that could easily happen again (or more) as they improve drivers for it.
Next there is FreeSync: say you are getting a card around $600 and a monitor around $300-400. There are many, many FreeSync monitors in that range and absolutely no G-Sync ones, so that is certainly a plus towards going with the AMD GPU and a FreeSync monitor.
Lastly, and this is the most speculative: how often has AMD had cut-down chips that could be unlocked to full versions? (I’ll answer: many times. R9 390s to 390Xs, RX 480 4GBs unlocked to 8GBs, and I think even some Fury cards to Fury Xs, setting a precedent for HBM cards.) I’m very tempted to get one on the chance this happens. I got a GTX 1080 for $680 that should arrive in the next couple of days, but I wouldn’t mind going for one of these Vega cards as I have a FreeSync monitor. Hmmm, decisions.
Just wanted to comment specifically on the FreeSync part:
Keep in mind, GSync and FreeSync aren’t the same thing. GSync is both an actual module and a certification process, FreeSync is just a VESA AdaptiveSync implementation with a few extra features, and sadly all the extras are considered optional – a monitor can advertise as FreeSync with only the bare minimum AdaptiveSync in a narrow VRR range.
Part of the reason GSync is more expensive is it’s the superior technology combined with a certification process to ensure minimum ghosting, something the cheaper FreeSync monitors tend to suffer from. The other major difference is GSync works across the full refresh range of the monitor (0-max), while anything based on AdaptiveSync only works within the VRR range of the monitor, which can be very narrow (48-75Hz is the most common) and suffers from stutter and tearing outside that range.
TLDR: FreeSync monitors tend to be cheaper but are often considerably inferior. GSync costs more but maintains higher standards on technology that does a better job of ensuring zero tearing across the full capabilities of the monitor.
Having just bought a cheap FreeSync enabled ultrawide LG monitor, you can get really good ones! I paid $299 for this: http://www.lg.com/us/monitors/lg-29UM68-P-ultrawide-monitor and it’s a great piece of kit. The FreeSync makes a HUGE difference, but I disagree that a range of 48-75Hz is narrow. If the game is jumping around nearly 30fps, you just need to lower the settings. I aim for 75fps but if you then get any dips you really don’t notice them. While technically it may not be as good as G-Sync, if you’re on a budget FreeSync can be amazing.
Sorry if my post wasn’t clear. There are definitely good FreeSync monitors out there, the problem is that it’s not as assured as it is with GSync monitors. For example, Low Framerate Compensation (LFC) requires a max VRR rate 2.5 times higher than the min VRR rate, which your average FreeSync monitor can’t handle, so the feature isn’t implemented.
With respect to 48-75Hz, it’s not an issue with frame rate fluctuation but with pure output. When things like FreeSync/GSync are enabled you disable vsync, which uncaps the game frame rate. If you get higher than 75fps or lower than 48fps on a 48-75Hz VRR monitor you lose the benefit of adaptive sync, ie. the game frame rate and the monitor refresh rate are no longer synchronised and you begin to get screen tearing. You can artificially cap the upper rate to match your monitor manually with FRTC in the AMD driver but it’s more of a bandaid than a fix. Unfortunately there’s no way to fix tearing and stutter outside the lower VRR bound.
I wasn’t trying to imply that FreeSync is garbage. Both technologies are better than vsync by a long way. What I was trying to convey was that they’re not both set features that do the same thing and are only distinguished by price, FreeSync can dip into some really crap implementations that you have to research to find out, and manufacturers are still reluctant to publish VRR figures for their monitors which makes that research a lot harder. GSync is more expensive, but offers some peace of mind by virtue of stricter standards and a more stringent certification program.
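The LFC requirement and the VRR-range behaviour described in this thread can be sketched as a couple of simple checks (a rough illustration of the rules as stated above, not AMD's actual driver logic):

```python
# Low Framerate Compensation (LFC) needs the monitor's max VRR rate
# to be at least 2.5x its min rate, per the comment above.
def supports_lfc(vrr_min, vrr_max, ratio=2.5):
    """True if the VRR range is wide enough for frame-doubling (LFC)."""
    return vrr_max >= ratio * vrr_min

def in_vrr_range(fps, vrr_min, vrr_max):
    """Adaptive sync only engages while the frame rate sits inside the range."""
    return vrr_min <= fps <= vrr_max

print(supports_lfc(48, 75))      # False: the common budget range is too narrow
print(supports_lfc(30, 144))     # True: 144 >= 2.5 * 30
print(in_vrr_range(40, 48, 75))  # False: below the range, tearing/stutter returns
```

This is why a 48-75Hz panel can't offer LFC (75 is well short of 2.5 × 48 = 120), while a 30-144Hz panel can.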
Good points, although you do still enable V-Sync while using FreeSync. It seems odd to me, but AMD have confirmed you should. I have FRTC disabled, but FreeSync on and V-Sync forced on. This way you don’t go over 75, and if it falls under 45 it snaps down to 30 with V-Sync. I also have their new Enhanced Sync switched on, although I’m not sure of the merits of that yet.
You are right of course, FreeSync allows for some shitty implementations which should never happen with G-Sync.
Sadly this just isn’t good enough these days. 75Hz?
I play Dota 2 at over 120FPS and you can definitely notice the 144Hz difference on my ASUS PG279Q monitor (with G-Sync). Counter-Strike is, again, very noticeable.
I’m sure you can tell the difference, but not everyone needs a 144Hz monitor. In most games I’m happy with a steady 30fps! But then again I don’t play any competitive games, though I imagine 120fps in Counter-Strike is pretty nice.
That’s why AMD made Freesync 2, it’s basically Freesync except the monitors have to follow a set standard by AMD which can compete with G-sync pretty well. However, we have yet to see any Freesync 2 displays.
Yep, FreeSync 2 is an improvement, though it goes into closed territory rather than open standard. The other problem is AMD have steadfastly refused to rule out charging licencing fees for FreeSync 2 despite reporters asking, which is worrying considering the whole point of FreeSync is it’s meant to be free.
Very true, but even then it doesn’t add cost to the monitor, so even just the basics of FreeSync is going to be an improvement. Also, it’s partly why I was saying monitors at $300-400 (maybe should have been up to $500), as they usually have pretty good implementations of FreeSync, but still none with G-Sync.
Definitely an improvement over vsync, I’m with you there. You can get sub-$500 GSync monitors though, I’m almost certain, as I know there are a few models that sell for around $350 USD.
For me personally, I tend to shop infrequently but get the top end when I do. I went with an Asus 34″ ultrawide GSync monitor up to 100Hz. Acer also has an excellent 34″ Predator model with very similar specs. They’re both quite expensive, but it’s more the 34″ 100Hz combo (which is quite difficult to drive) that contributes to the cost than the GSync processor.
You are right, there are a couple of G-Sync monitors for around $520 (though I can’t say I’d want to spend that on a 1080p TN panel even if it is 144Hz), plus you can get the equivalent FreeSync monitor for like $300 ($249 sometimes, Kogan has one). So that’s about $200 due to G-Sync, and it seems about the same for 1440p panels, with FreeSync ones probably around $500 and G-Sync $700.
Nice call on that monitor, it should last you many years and still look great. I just got a 29″ 1080p ultrawide 75Hz FreeSync, so I think my GTX 1080 is way overkill, but the monitor was only $300. I was originally going with an R9 390, but through some shenanigans the 1080 has only really cost me $460 (gotta love miners paying twice retail for a card, lol. You served me well, 390).
For a second there I thought you guys were quoting USD for the NVIDIA cards, was going to say someone is being RIPPED off. I expect NVIDIA to do a price drop soon on those, also you can get them for $600aud if you bypass all the Australia rip off middle men.
Man those things are hungry for power. And heat. I hope they do well though. Without AMD NVidia will just charge whatever they like.
GTX 1070 MSRP is $349 and GTX 1080 MSRP is $499. Vega MSRP is not better, and the discount from the pack only applies to the most expensive Ryzen 7 CPUs, the three most expensive X370 motherboards, and that single most expensive FreeSync monitor (which has problems in its Ultimate Engine FreeSync mode). It’s a shame, because even a cheaper Korean ultrawide monitor like the Crossover, at half the Samsung’s price, can manage a 48-95Hz FreeSync range fine, while that specific Samsung model has problems with its 48-100Hz FreeSync range. Just Google the model.
Those aren’t the MSRP’s in Australia, though.
New Vega card, eh?
I just got a GTX 1080 for AU$899. Did I get shafted?
No, I don’t think so. You’ve got a good card, it’s still a top performer (on par or slightly better than Vega) and the price is reasonable given how nuts demand for GPUs has been over the last few months. You’ll be happy with the performance.
cool, thanks.
Only 1.89Gbps…? Is that a typo?
No, it’s due to the much wider 2048-bit bus. The total bandwidth is the same, though.
As Astrix said above, no typo.
Yeah, it will run at roughly ~450MHz; my Fury X’s memory is clocked at 500MHz.
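For the curious, the bandwidth arithmetic behind that exchange checks out: total bandwidth is the per-pin data rate times the bus width, so HBM2's slow-looking 1.89Gbps matches GDDR5X once the 2048-bit bus is factored in (the 1080 Ti's 352-bit bus width is NVIDIA's published spec):

```python
# Memory bandwidth = per-pin data rate (Gbps) x bus width (bits) / 8 bits-per-byte.
def bandwidth_gbs(gbps_per_pin, bus_bits):
    """Total memory bandwidth in GB/s."""
    return gbps_per_pin * bus_bits / 8

print(bandwidth_gbs(1.89, 2048))  # HBM2 on Vega 64: ~484 GB/s on a 2048-bit bus
print(bandwidth_gbs(11.0, 352))   # GDDR5X on the 1080 Ti: 484 GB/s on a 352-bit bus
```

Both work out to the table's 484 GB/s figure, which is why the slow-per-pin HBM2 isn't a handicap.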