Nvidia CEO Absolutely Lays Into AMD

In the world of CPUs and GPUs, it's not often that you see competitors directly sniping at each other. A veiled dig or two might pop up during a product launch or some other event, but by and large that part of the tech industry tends to be pretty civil.

But for whatever reason, Nvidia CEO Jen-Hsun Huang sat down for an interview and threw that playbook out entirely.

In a roundtable interview published by PC World's Gordon Mah Ung, the long-serving Nvidia CEO utterly savaged the recent launch of AMD's 7nm consumer GPU, the Radeon 7.

Rather than simply pointing to the superior performance of the GeForce RTX 2080 and 2080 Ti - particularly the latter, which will go unchallenged as the king of gaming GPUs for the foreseeable future - Huang described AMD's new GPU as "lousy" with no new features:

“The performance is lousy and there’s nothing new ... [there’s] no ray tracing, no AI. It’s 7nm with HBM memory that barely keeps up with a 2080. And if we turn on DLSS we’ll crush it. And if we turn on ray tracing we’ll crush it.”

It's worth noting that AMD has confirmed repeatedly that they are working on ray tracing. The company revealed last year that they had an open source implementation of ray tracing in Radeon Rays, which was previously called AMD FireRays.

Still, it's a moot point right now, given that so few games use any form of real-time ray tracing anyway. That's likely to change over the next year, by which time the gaming ecosystem will have more games that can take advantage of specialist chipsets and cores on GPUs.

But Huang's low opinion of AMD wasn't limited to the Radeon 7. After suggesting that the new GPU launch was "underwhelming", he went on to describe the AMD FreeSync technology as completely non-functional, even with AMD hardware:

[FreeSync] was never proven to work. As you know, we invented the area of adaptive sync. The truth is most of the FreeSync monitors do not work. They do not even work with AMD’s graphics cards.

While AMD users - and Xbox One gamers using variable refresh rate on certain monitors and TVs - would disagree, Huang's view is firmly rooted in the certification tests Nvidia does when working with monitors. When the company announced that it would begin supporting the VESA Adaptive Sync standard, which is more or less what AMD's FreeSync monitors use (over DisplayPort), it noted that there would be a string of monitors that it deemed "G-Sync Compatible".

If a monitor isn't capable of meeting those requirements - and the company hasn't outlined precisely what all of those requirements are, although some are obvious, like making sure there aren't any artifacting or banding issues - then, as far as Nvidia is concerned, variable refresh rate/adaptive sync doesn't work.
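
To make the gist of that concrete, here is a minimal hypothetical sketch of what a pass/fail certification check of this kind boils down to - the individual tests and thresholds below are invented for illustration, since Nvidia hasn't published its full criteria:

# Hypothetical sketch of a "certified compatible" style check.
# The tests and thresholds are invented for illustration; Nvidia hasn't
# published its full G-Sync Compatible criteria.
from dataclasses import dataclass

@dataclass
class MonitorTestResult:
    vrr_min_hz: int             # lowest refresh rate the panel syncs at
    vrr_max_hz: int             # highest refresh rate the panel syncs at
    flicker_observed: bool      # visible flicker during refresh rate changes
    artifacting_observed: bool  # banding, blanking or other visual artifacts

def is_compatible(result: MonitorTestResult) -> bool:
    """Return True only if every check passes; a single failure fails the lot."""
    wide_enough_range = result.vrr_max_hz >= 2 * result.vrr_min_hz
    clean_image = not (result.flicker_observed or result.artifacting_observed)
    return wide_enough_range and clean_image

# Under these invented rules a narrow 48-75Hz panel fails, even though it
# nominally "supports" adaptive sync, while a 48-144Hz panel passes.
print(is_compatible(MonitorTestResult(48, 75, False, False)))   # False
print(is_compatible(MonitorTestResult(48, 144, False, False)))  # True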

"We believe that you have to test it to promise that it works,” Huang added.

Obviously AMD and FreeSync owners would disagree. The technology does work - the VESA standard is proof of that - but the two companies are shooting at different sets of goalposts.

Huang also saved some barbs for Intel, describing their graphics team as "basically AMD's". You can read the interview in its entirety here. It's not the first time the Nvidia CEO has had a dig at competitors, but he usually does so with less savagery. And while he does have a point about the Radeon 7 lacking forward-facing features like specialist cores for AI and ray-tracing technology, the real question is: does the first generation of 7nm AMD cards need those features, when Nvidia is pushing the future forward, or does it make more sense to add them in 18 to 24 months?

There was no such restraint today, for whatever reason. The CEO's annoyance isn't likely to last particularly long, though: at the time of writing AMD stock is down 2.7% to $US20.19 on the NASDAQ, while Nvidia shares are up almost 2% to $US142.58.


Comments

    >And if we turn on ray tracing we’ll crush it.
    Well no you won't, because you take a f*ck huge hit to your framerate when you turn it on.
    As for DLSS, while I believe something along those lines might one day be pretty awesome, the actual in-game examples of it (FFXV) look pretty questionable so far.
    This isn't to defend the Radeon 7 (as its price point makes it DOA tbh), it's just that his points are ridiculous. "Their card sucks, it doesn't have this feature that's used in like 3 games and that also decimates your framerate when you use it".

      Well technically if you turn on raytracing the AMD card flat out won't work *. So yeah that'd be crushing. That said, I see RT as a gimmick *at the moment*. Until it gets to the point there is widespread support in games and we have mainstream rather than $2000 cards capable of solid performance it's a no go for me.

      * Yes I know any well written game should drop back to a supported API like plain DX11/12 but that's not the point.

      I'm not really sold on any form of anti-aliasing once you go to 4k resolution. I honestly can't see an appreciable difference between no AA and high levels of AA at 4k (maybe that's my eyesight failing as I get older) so I'd rather just save the GPU performance and not use it anyway.

      I'm not sure about the price point being bad, it's $100 US cheaper than the Nvidia counterpart. That's a noticeable difference for a card that on paper at least appears to offer the same performance (minus RTX). If nothing else it is likely to drive Nvidia's prices down.

        Haha, you make a good point.
        Yeah I agree it's clearly gonna be important in the future, but I just think they should have waited for the 7nm die shrink so the cost to add the extra RTX hardware wasn't so massive. Because as you say, $2000 is insane.

        I get you, though I guess to me it depends on the monitor, as I'd like a 32 inch 4K monitor. But I feel that still needs some antialiasing. Plus DLSS is also about increasing performance/framerate, as it is less 'heavy' than other AA methods. So if they can get the quality of it up I think it would be a valuable addition to PC settings, as there will always be the new cutting edge graphics game where you need some more performance without sacrificing looks too much, and DLSS COULD be the way.

        Well, isn't the RTX 2080 $699 for the plain version and $799 for AIBs? So if this is $699 for the plain Radeon 7, the AIB versions will likely be more expensive. So the cost between them will be pretty close.
        Plus then you gotta compare to previous cards: you have been able to get the GTX 1080 Ti for as cheap (or cheaper) for a year and a half now, and that's the performance it's meeting (and more comparable as neither have RTX). So basically it's taken 2 years to match Nvidia (and still at higher power draw no doubt), while Nvidia has just made a newer, more powerful high end. Plus it's like 50-60% more expensive than Vega 64 but only like 25-30% better according to their own numbers.

          Like I said, I see no difference with AA at 4k (on a 30" monitor) so it's pointless to me. But yeah, if they can use their AI units for something other than DLSS (which is just AA by a different name) then sure that's of value. But first they actually need to do that. Hardware with no supporting software is useless. Interesting, but useless.

          As for the prices, I'm not sure. I just read an article at a different site stating the 2080 MSRP is $799 but looking into it more seems to suggest that's the founders edition and OEM versions may be $699.

          The more detailed report seems to indicate it's noticeably quicker than the 1080, and was quicker than the 2080 in a lot of games. Of course, that's an AMD report and they don't disclose settings (other than that they were running at 4K).
          https://www.overclock3d.net/news/gpu_displays/amd_releases_additional_benchmarks_for_their_radeon_vii_gpu/1#.XDdEE-ZG2Yg.twitter

          It's nearly 70% faster than their previous gen in at least one case, but yeah 25-30% in most. Which is still an impressive uplift.

      You obviously read that how you want to see it. Huang said if we turn on "DLSS", not raytracing. Raytracing and DLSS are two very different things. They have already proved DLSS gives massive performance gains.
      Raytracing looks pretty but I'm far more interested in DLSS; the more games that adopt that tech, the better it will get with Nvidia's deep learning.

        No, I'm reading it exactly how it's said. He said separately for each of raytracing and DLSS, 'turn it on and we'll crush it'.
        And yeah, exactly like I said, I'm quite interested in how DLSS turns out too, as it did increase performance, but right now it looks pretty bad so I'm hoping that gets improved.

        Huang said if we turn on "DLSS" not Raytracing.
        He said more than that...
        And if we turn on DLSS we’ll crush it. And if we turn on ray tracing we’ll crush it.

    Sounds like having a first generation 7nm die that almost beats their fully mature 16nm die is causing him to be a little touchy.

    Sure, this generation of cards is a little behind NVidia in raw performance, but the advantages they can squeeze out of bringing 7nm to maturity while NVidia can't get a mature 16nm platform they charge $1000 for to work properly should have him very worried.

    All he's done here is prove he's a salty shit lord, maybe he can go learn how not to be a giant anti consumer choad before he starts throwing shade.

    The truth is most of the FreeSync monitors do not work. They do not even work with AMD’s graphics cards.
    That is the kind of statement that can get you in legal trouble.

    I was hoping the new AMD card would be more like $599 instead of $699, but it will be interesting to see where they go with it.

      Fingers crossed the OEMs will offer some better deals.

      And yeah, that's a pretty risky statement. I'd love to see the "truth" he's basing it on. Does he have a bunch of data about how well it works?

        Per an article on here or Gizmodo not long ago, it seems that he does have a bunch of data about how well it works.

        nVidia is implementing support for the VESA Adaptive Sync standard, and apparently finding that a bunch (the majority) of monitors professing to support the standard don't actually support the standard. Presumably, they additionally tested those monitors on AMD GPUs to check whether it was nVidia's implementation or the monitor at fault. Maybe?

          Yeah I read something similar. Like I said I'd love to see that data.

      Should be noted that this is essentially another Vega Frontier card, more aimed at creators than gamers.

    The performance is lousy and there’s nothing new ... [there’s] no ray tracing, no AI. It’s 7nm with HBM memory that barely keeps up with a 2080
    Does this mean the 2080 has lousy performance?

      No, he's saying "their cutting edge tech barely performs as well as the 'not top of the line' current gen tech". i.e. it should be doing much better than it is.

        It seems odd to describe Nvidia's 2nd best consumer card as having lousy performance, as it is the one that is the same price as the AMD one.

    This strikes me more that Huang is somewhat scared of what's coming down the pipeline.

    Nvidia got forced out of its lucrative Gsync business because otherwise it wouldn't be able to certify against HDMI 2.1.

    Nvidia's again pushed out of the console space (yes, we know there's the Switch, but the Switch won't be defining GPU trends), which will continue to see Nvidia technology only bolted on, rather than core to any game experience. In 6-12 months we will see two new consoles built on AMD Navi GPUs, and corresponding cards for desktop. This is new, as traditionally console GPUs have lagged behind desktop tech, and it will invariably benefit AMD for optimisation.

    And lastly, with Intel entering the GPU space, Nvidia is left out in the cold with a lack of x86 ability, and only so much of that can be countered with its investment in RISC.

    Full disclosure - I own a 2080 Ti, but to me this comes across as a temper tantrum rather than anything meaningful

      Yeah. Very good points. And whilst AMD aren't very competitive with desktop solutions, they have their GPUs in an astounding number of other devices, such as consoles, which means some very strong foundations.

      AMD isn't going anywhere. I do wish they could have brought out a more powerful solution though.

    There's no actual realtime raytracing... he is aware that his own cards don't actually raytrace, right? Calling a cat a dog does not make it a dog.

    He is also aware that Nvidia did not invent any sync tech, G-Sync being a wrapper with a couple extra bits around the Freesync standard to make sure you use a g-sync certified monitor.

      Both of the things you said are wrong.

      The RTX cards do perform realtime ray tracing. They're capable of doing an entire scene if you don't mind the render time, but for gaming purposes they're intended to be in hybrid rendering mode, where base scene information is rasterized traditionally, with specular/diffuse/emissive buffers (built through deferred rendering) passed to the RT API to raytrace realtime reflections, shadows and other details where it provides the most visible results.
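
To make that hybrid frame structure concrete, here is a deliberately toy, minimal sketch of the flow - the function names, buffers and data are placeholders invented for illustration, not any real engine or RT API:

# Toy sketch of a hybrid-rendered frame: rasterise the base scene into a
# G-buffer first, then ray trace only selected effects such as reflections.
# Everything here is a placeholder for illustration, not a real engine/API.
def rasterize_gbuffer(scene):
    # Stand-in for the traditional deferred rasterisation pass.
    return {
        "albedo":   [p["albedo"] for p in scene],
        "specular": [p["specular"] for p in scene],
    }

def trace_reflections(gbuffer):
    # Stand-in for the RT pass: in reality rays are cast per pixel against
    # scene geometry; here we just flag which pixels would get traced.
    return [s > 0.5 for s in gbuffer["specular"]]  # shiny pixels only

def render_frame(scene):
    gbuffer = rasterize_gbuffer(scene)        # raster pass builds the buffers
    reflections = trace_reflections(gbuffer)  # RT pass adds the costly detail
    return gbuffer, reflections               # compositing would follow

scene = [{"albedo": 0.8, "specular": 0.9}, {"albedo": 0.3, "specular": 0.1}]
print(render_frame(scene))

The point of the structure is simply that the expensive ray casting is confined to the passes where it is most visible, rather than tracing the whole scene.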

      Nvidia was the first to market with a frame sync tech beyond vsync. The hardware to support variable VBLANK (among other features) didn't exist before Gsync (Oct 2013), which is exactly why Nvidia developed a custom scaler for the monitor. VESA didn't announce the Adaptive Sync modification to the DisplayPort 1.2a standard until mid 2014, but supporting hardware came later because beyond just the standard update it still requires a hardware scaler, similar to Gsync. Freesync didn't hit the market until 2015. What you described is the opposite of reality, Freesync is the tech based on VESA's adaptive sync with 'a couple extra bits', while Gsync's approach is completely different and technically more effective. Freesync 2's 'extra bits' have actually made that standard proprietary.

        The only problem I have with Gsync is that it's typically far more expensive and proprietary. It is a better solution but when it adds a premium to the monitor cost... I guess I'll go with freesync.

          Just make sure the freesync monitor you buy does actually support the features you want, is all.

    Gsync and Ray Tracing aside, he's not wrong, but I'm sure some AMD fanboys will think this new Radeon 7 is the holy grail GPU they've been waiting for...

    If it was $100 cheaper and 100W lower TDP I'd maybe be impressed... but it's another thermally pushed GPU from AMD with a rather high price-tag, because they MUST go with 4 stacks of HBM2 due to the architecture's crap bandwidth requirements.

    NAVI is going to have to be something truly amazing.. but I doubt it.

    @slipoch I don't think you actually know what you're talking about mate, but each to his own...


      I like AMD, but I also like Nvidia, so I guess I'm not a fanboy. But it is nice to see a solid product from AMD. The last couple generations have been underwhelming.

      When you say $100 cheaper, than what? Its price or the Nvidia price? At least AMD have maintained parity with their previous generation prices - it's the same RRP as the previous Vega card. There was no massive price jump like Nvidia threw out.

      As for the power usage, I half agree with you. It's not as efficient as the Nvidia cards, but it is rated 50W lower than the previous Vega. So that's an improvement.

        It's not the same RRP as the previous Vega. Vega 64 was selling for $699 AUD, this is $699 USD. So when it comes to Aus it will be $1099+. So yes, they made a massive price jump too: a 50-60% price jump for maybe 30% more performance (going by AMD's likely best case testing).

          If your numbers are true, these are pretty much exactly the same figures as the 2080 - a 60% price hike for a 30-40% performance gain. The RTX cards bring two new major features, while the AMD cards bring only technical changes (fabrication process and HBM).

            Yeah, so they are both garbage is basically what I'm getting at. Both insane ripoffs, or more accurately have things that are not needed that add cost for basically no benefit to actual gamers. Though yes, the RTX line is slightly less garbage, as at least they are innovating, which has got to count for something.

          Vega 64 never sold for AUD$699 at launch... The RRP was around AUD$850.

          Also remember this was pretty much at the height of the bitcoin mining boom, where every GPU was insanely high.

          MSRP of USD$699 is AUD$966.54, and we all know that it will NOT sell at MSRP.

            I definitely saw it at like $750, but yeah, it went up pretty immediately because of mining (which I'd somewhat leave out of an argument on this, because this card too would go up by 50% if mining were still so valuable; plus before and after mining it was at its rightful price).

            Also you are forgetting that you need to add 10% GST onto that $970, so there's $1067. Plus about another 10% "Australia tax", which over the last few GPU and CPU releases has been about how much more local prices have been than the converted, tax-added price. That would be $1173, but say a flat $1150. There is 100% no way it will be under $1099, and I'm betting it will actually be rounded up to $1199.

          Sorry I was looking at the Vega Liquid which was $699.

      I think so too - the AMD Vega GPUs have always struck me as inferior architectures that rely on an advanced fab process and HBM2 memory to compensate, and burn a lot more power as a result. I can understand his contempt even if it is a bit bizarre for him to have a chair-throwing moment over it. Also, the digs at Freesync are a lot harder to fathom.

    This guy talks with Trump-level exaggeration. I don't trust his words for a second.

    Huang's obviously being a bit of a dick here, but when he said Freesync doesn't work that's an exaggeration rather than a lie.

    The problem with Freesync at the moment is there's no certification and barely even a minimum feature set before a company can put the name on a monitor. Freesync (itself a shell implementation over VESA's AdaptiveSync) tends to be limited to the monitor's VRR range, which for some monitors can be incredibly narrow (e.g. 48-75Hz isn't uncommon). With frame rates outside the VRR range, you still end up getting all the problems of tearing, ghosting and stuttering back again. Other tech like overdrive and LFC (low framerate compensation) is completely optional - manufacturers may or may not decide to implement it, and still carry the 'Freesync' label.

    Comparatively, Gsync works over the full refresh capability range of the monitor (zero to max) and its certification process ensures that every Gsync monitor supports the complete feature set.

    It means when you go to a shop to buy a monitor, if it says 'Gsync' on it you know exactly what it's capable of, but if it says 'Freesync' you still have to go digging into the tech specs to find what its VRR range is, and whether it supports the features you want, assuming they've even put the details in there (many don't). A lot of people won't even realise their Freesync monitor doesn't do what they expect it to do, which is why you have weird advice popping up on forums like 'enable vsync to get your freesync monitor to work properly', which is basically defeating the purpose of adaptive sync altogether.
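
As a rough illustration of why the VRR range and LFC matter, here is a minimal sketch - the numbers and behaviour below are simplified for the example and aren't any vendor's actual implementation:

# Illustrative sketch of how a monitor's VRR range and LFC interact with
# the game's frame rate. Simplified for the example, not vendor logic.
def refresh_behaviour(fps, vrr_min, vrr_max, has_lfc):
    """Describe what the panel can do for a given game frame rate."""
    if vrr_min <= fps <= vrr_max:
        return f"adaptive sync: refresh follows the game at {fps}Hz"
    if fps < vrr_min and has_lfc:
        # LFC repeats each frame enough times to keep the panel in its range.
        multiplier = -(-vrr_min // fps)  # ceiling division
        return f"LFC: each frame shown {multiplier}x, panel refreshes at {fps * multiplier}Hz"
    return "outside the VRR range: back to tearing or vsync stutter"

# A narrow 48-75Hz panel without LFC vs a 40-144Hz panel with LFC, both at 35fps:
print(refresh_behaviour(35, 48, 75, has_lfc=False))
print(refresh_behaviour(35, 40, 144, has_lfc=True))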

    Just gave me more reason to support AMD in my upcoming build. I'm done with Nvidia's lies and marketing stunts. Oh, and nowadays totally overpriced hot garbage GPUs.


    At the end of the day, Nvidia's pricing is a joke: claiming they make gaming GPUs for gamers, then putting them out of reach rather than selling lower and making it up in quantity like AMD, because their real target is crypto farmers.
    All they do is use these expos for fanboys to hype their overpriced hardware using technology which is barely incorporated into any games.
    The whole reason he's salty is because he knows people will still buy the AMD card, it being the cheaper and more realistic option, still achieving 4K, because who cares about ray tracing at the moment, and why does it come at a premium?
