The general consensus was that AMD would get first dibs on consumers in 2016 with their Polaris line of GPUs, but that might not be the case: a string of reports popping up over the last 24 hours suggests NVIDIA could beat them to the punch.
The reports come not long after AMD announced a substantial price cut on the Radeon R9 Nano to $849, and they add an extra touch of spice for anyone preparing a GPU upgrade to go with their early adoption of VR this year.
Techfrag ran with the supposed leak that NVIDIA would use the Pascal architecture for all of their releases in 2016. Furthermore, the first of those releases will come in April. They’ll also be what’s called “TITAN-grade” products, which is understood to mean a Pascal-equivalent replacement for the current TITAN offerings.
The core series of GPUs, meanwhile, will reportedly be shown off in June. That could mean that NVIDIA will use Computex as the stage to show off the successors to the GTX 950/960/970 series, although whether they’ll be made available in Australia at the same time is another matter entirely.
One fun aspect of all this rumour-mongering is the specs. The GP100 chip, which will be the flagship Pascal GPU, will supposedly sport a staggering 32GB of VRAM. It’ll also be using HBM2, rather than the GDDR5X memory which will be used for the core Pascal GPUs.
Just to remind everyone: none of this is confirmed at this stage. The only supporting information is a handful of shipping manifests on the Zauba import/export database, and an actual 32GB GPU seems highly doubtful, since it would require production of 8GB HBM2 modules to have already started (which, as I understood it, hadn’t begun yet).
Apart from Computex, NVIDIA might also have some news to break at the GPU Technology Conference in the first week of April. The company’s CEO, Jen-Hsun Huang, is scheduled to give the conference keynote, and if there’s anything particularly interesting from a technology standpoint, that would be the likely place for him to reveal it.
Comments
21 responses to “Rumours: NVIDIA Will Unveil Their First Pascal GPU In April”
I hope this is true, since I’m looking to upgrade this year. My 780Ti is definitely due for replacement (okay, maybe it doesn’t need one so bad, but I want one)! I’ll be really interested to see what the 1080/1080Ti (or whatever they call the flagships) can do. Supposedly Pascal is looking to be a pretty big leap forward thanks to HBM2 and the substantial die shrink.
Ahhh, the upgrade bug hits again! I hope the article is incorrect in saying that only the TITAN-level GPUs will get the HBM2 memory. That’s going to mean at least $1500 will be asked if you want HBM2, if not more!
The bug has been hitting for me since the 980Ti came out :p But I decided to be “sensible” and wait for Pascal because of the rumoured bigger leap from Maxwell > Pascal than Kepler > Maxwell.
Yeah, I’ve heard mixed reports on the HBM2, but most of the more recent ones suggest Titan-level flagship only. Which is a shame, as price-wise I can justify a 1080Ti level card, but not Titan replacement.
I know the feeling, mate! I bit the bullet and bought two 980 Tis! And I was upgrading from an R9 295X2, which is still very relevant as long as there’s CrossFire support. The thing is, NVIDIA isn’t obligated to truly push the boundaries in terms of performance, so you usually only see a 10 percent increase year on year. I hope this year will be different, but I doubt it. They know that the cards will sell like hotcakes no matter what …
It’s terrible! What a problem to have, eh? Damn, that’s a nice setup. What made you change from the 295?
Yeah, unfortunately that’s the case when they don’t truly have that much competition. Even as an nvidia fan I do wish something would give them a bit of a kick in the behind to encourage them to push harder.
I’m hoping the massive die shrink, plus the fact Pascal is on a different “tick” to Maxwell in the tick-tock cycle, means we might see a bigger performance jump.
Yeah, more competition would be nice. It’ll be fun to see how this year plays out.
Incidentally, the rumour is that GDDR5X will be used for the core line and HBM2 for the TITAN replacements, which does make a little bit of sense. It might not play out that way, but there’s a logic to it.
GDDR5X uses technology which the manufacturers already understand very well, and while HBM has a lot of promise … well, everything that had a lot of promise over DDR memory ended up losing out in the end.
As a little addendum to what I wrote in the article: Samsung hasn’t started production of 8GB HBM2 chips yet, and that’d be necessary for a 32GB VRAM GPU to appear since you’d need 4x8GB stacks. (Because the DRAM dies are stacked on top of each other, from what I understand. That might not be 100% correct.)
Anyway, and I didn’t mention this above either: the fun part will be what AMD does with HBM and their APUs. That could really rewrite the book on what performance you might get out of a sub-$1k PC (if future APUs with HBM2 are priced similarly to today’s), and let’s not forget the mammoth boost it’d give to future consoles.
Fun times for tech.
It’s the same deal with Intel in terms of CPUs. As much as I make fun of AMD CPUs, Intel really need some competition, or they’ll just continue not giving a shit. Why would they care about introducing revolutionary new features when people will buy them anyway?
Yeah, I suppose that would make sense – trial the newer, more expensive and untested tech on a smaller scale in the flagships. Then if that works, roll it out to the rest of the line in the next generation.
Agreed re: the APUs etc. That is one area I have been pretty impressed with what AMD has done. The idea of an APU with an integrated “baby Fury” GPU would make for a very nice SFF or HTPC.
I agree. Intel run their own foundries; AMD spun off theirs as GlobalFoundries due to financial difficulties. Intel is the only major player in the game that both designs and manufactures its own chips. The rest of the foundries are now on par with Intel (for now), which is a good chance for AMD to make a comeback.
I loved the 295, but AMD was so slow in supporting CrossFire that I just gave up on it. The profiles took months to come out, and even then you’d have to tinker with them to get them to work. Plus I wanted to game at 4K, and the 295 just didn’t have enough grunt. I’m very happy with my 980 Tis and glad I bought them.
Yeah, no competition means very little progress, but I do like how efficient the NVIDIA cards are, and hopefully Pascal will take that even further.
CrossFire and SLI are going the way of the dodo, unfortunately. Too many games either don’t support them or don’t work properly with them. Neither company is about to come out and say so, because it would mean customers would stop buying multiple instances of their products.
I’m getting rid of one of my GTX 980s today, which isn’t intended as evidence of the above – merely a coincidence.
If you’re looking for better performance I’d recommend a GTX Titan X (or whatever they end up replacing it with).
Yeah, you may be right. I’ve had some issues with Fallout 4 and its SLI profile, but I think that’s more to do with the actual optimisation of the game than the profile itself. Sometimes my FPS will drop into the 30s for no particular reason when walking through Boston. Also, G-Sync doesn’t seem to work with Fallout 4, which is a massive drainer. You can always use Nvidia Inspector to create profiles from the existing set, but that’s more tinkering than I have time for. Next upgrade will be a single-card upgrade, but that won’t be happening for a while (unless I can sneak a couple of grand past the mrs).
I think it will be a bigger leap than we’ve seen in the past few years, for both GPU manufacturers. From what I recall, the GPU market has been stuck on the 28nm process for far too long. They’re finally jumping to 14nm. Big difference right there.
This is because GPU designers now have to follow after the smartphone SoC designs to get the foundry attention they need to manufacture at mass scale.
The other major difference is FinFET (little to do with the GPU designers and everything to do with the foundries), which will help with power efficiency. That in turn will mean more powerful GPUs in a smaller power envelope.
For these reasons, I’m very confident in the new GPUs. So much so that I think buying one today is a big mistake. The catch is the new architecture and, more importantly, who is going to put these new architectures to good use. That’s where I hope DX12 and Vulkan/Mantle can help.
Hope you’re right! I’m excited to see what the next gen brings, even though I won’t be partaking in it. They’ll probably release a flagship Titan with 16GB of HBM2 and a 980 Ti equivalent with 8GB of HBM2, which will most likely be enough for 4K at 60fps, but we’ll just have to wait and see.
I agree it would be nice, but the 780 Ti isn’t down and out yet. I’m running two in SLI, but I can play Just Cause 3 at full specs while using just one card. Sure, its usage is at 98%, but that’s within the operational threshold. Even one ran Fallout 4 very well before the SLI support update was released. Wait till the new generation’s Ti and Classified models are released before letting the upgrade bug get hold of you.
Oh, don’t get me wrong, it’s not like the 780 Ti is underpowered, I just want to upgrade, lol. Plus I plan on playing The Division and other new games at >1080p resolutions, so I figure it might be a good time to future-proof myself.
No way in hell this is true lol. Kotaku? lol.
If your 780 Ti is due for replacement, my 570 is two steps from being part of a museum collection.
lol and I was thinking my 970 was due for replacement
Think I’ll have to upgrade this year. However, my JetStream 770s in SLI are still going really strong… Not sure if it would be worth upgrading the cards when I’m still on an older board with DDR3 RAM, though.
My poor old 690 will probably see me through the next couple of years (I hope…)
Would have to upgrade the rest of my PC (it’s older than the 690) to make one of these new cards worthwhile, and that’s only likely if I won lotto…
Support FreeSync in your drivers, you cheap fcks. Really sick of NVIDIA being snobs. I’m not EVER going to support G-Sync, what a donkey load that is!
Lol. I agree, there’s no reason for them not to support a feature that comes standard with DisplayPort. I hope they will, but then again, this is nVidia we’re talking about.