Pre-2010, the antics of NVIDIA and AMD (or ATI, back then) were one of PC gaming’s biggest topics, but in recent years the “rivalry”, as it were, largely went off the boil. Then AMD released Mantle, a lean 3D graphics API and competitor to Direct3D and OpenGL, and suddenly it was on again. Now the two companies exchange barbs on a regular basis, with NVIDIA providing the latest salvo.
Maximum PC managed to get NVIDIA engineers Tom Petersen and Rev Lebaradian into a room for almost two hours to quiz them on a range of topics, including their opinion on Mantle.
One of the first issues tackled is the supposed contractual obligations of developers who use NVIDIA’s GameWorks that stop them from tuning their titles for AMD hardware.
Lebaradian is very clear on the matter:
That’s just false … it wouldn’t be a reasonable action for a game developer to do, right? There’s nothing about the contract that NVIDIA does that prevents a developer from optimising for anybody. Nobody would agree to that kind of stuff.
He also clears up confusion surrounding what Mantle provides — or specifically, what it doesn’t:
I think there’s a fundamental difference here. With Mantle, it’s not really doing anything that you couldn’t do before. Maybe you’ll get some more performance in certain segments, but fundamentally it’s not actually adding any new features to a game.
DSO Gaming’s John Papadopoulos pulled out a few quotes from the interview, including what involvement NVIDIA has, or will have, with Mantle:
We don’t know much about Mantle, we are not part of Mantle. And clearly if they see value there they should go for it. And if they can convince game developers to go for it, go for it. It’s not an NVIDIA thing. The key thing is to develop great technologies that deliver benefits to gamers. Now in the case of Mantle it’s not so clear to me that there is a lot of obvious benefits there.
There’s a bit of marketing spin in there, as you’d expect, and I find it interesting that NVIDIA hasn’t done at least some research on what is arguably its competitor’s most significant software endeavour ever. The pair defend this position by declaring that the refresh of the Direct3D API in DirectX 12 will offer “essentially” the same performance gains:
It’s possible to pull performance out of DirectX, we’re [proving] that, and so you can argue that maybe it’s not a good idea to put extra effort into yet another API that does the same thing essentially. Feature wise there is nothing more … DX12 is coming and a lot of the features, the benefits of having a lower level API (the extra calls and stuff), it’s going to be in DX12.
If you’d like to watch the chat in full, put aside an hour and 40 minutes and hit the video below.
NVIDIA Finally Officially Speaks About AMD’s Mantle — Will Not Support It, No Real Benefit Using It [Maximum PC, via DSO Gaming]
Comments
35 responses to “NVIDIA Has No Interest In Mantle, Doesn’t Do ‘Anything You Couldn’t Do Before’”
IDK if I agree with the whole ‘it’s not adding anything’.
I’d take some extra performance over some gimmicky technology that only a handful of games support, like PhysX, any day.
That said I love PhysX and would like to have both PhysX and Mantle from one company. If they don’t want to support Mantle, I feel we, the consumers, are the only ones who are missing out.
For now I have ATI; when I upgrade (soon) I’m going nVidia. Why? Because I jump around from team to team every time I upgrade, for a variety of features etc. (My last card was a 550 Ti, now a 7870; looking at maybe getting a 780 now or holding out for an 880.) But if one side could include more than just their own offerings, I don’t think I’d ever swap teams again. (Until something truly groundbreaking happened exclusively on one platform, which normally isn’t that often… or at all.)
Again I’m just saying, I’d love PhysX AND Mantle, and I don’t think I’m the only one.
Smart guy. Brand loyalty is ultimately worse for the consumer. I have switched between brands since I got my first card in 1999. Check out this monster
http://en.wikipedia.org/wiki/RIVA_TNT2#mediaviewer/File:NVidia_RIVA_TNT2_%28ca._2001%29.jpg
Just beautiful. I had a voodoo 2 first, then I upgraded to one of these.
TNT2 was *the shit* back in the day man 😀
Sure was! Quake 2 in software mode was for chumps.
And still a better shooter than most modern ones.
If their comments on DX12 performance are accurate then Mantle really doesn’t bring anything new to the table, but does risk fragmenting the market between two incompatible APIs.
All the more reason nVidia could back it.
Considering that Mantle is, presumably, closer to what they can use in the PS4/Xbox One and works/will work on almost everything, unlike DX.
I mean, I know this is a pipe dream scenario, but listen to this:
AMD already support it and (supposedly) soon Intel will too for its integrated chips… nVidia will be the odd one out. Not to mention that Mantle is said to be making its way to Linux (and presumably OS X), which means that Mantle will be in the prime spot for ease of development and compatibility if you are working on multiple platforms. That may also open the market in terms of Linux and Mac ports of games, AND bridge the gap on the consoles, because they won’t be split between DX and another API. (Not to mention that it is already a little more optimised than DX, even though it’s still a baby in comparison.)
Not bashing DX, but they are just playing catch up. I really wish this would be the one thing nVidia backed from AMD; it actually makes a lot of sense. IMHO if they don’t, they may well create a gap, or as you say, fragment the market. Which would only harm us, the consumers.
All of that is just my point of view and I doubt it would be that black & white, but still. It’s a competent and, in the future, a widely supported API. Unless DX12 is completely revolutionary… which I doubt. It’ll be good, but as I say, they are just catching up I think.
There is no Mantle support on either console at the moment, it requires system support in addition to the hardware. Sony might be open to supporting it though they have no reason to, since their own API offers the same low level access as Mantle anyway, and Microsoft won’t support it on the X1 when they can just improve DirectX instead.
There are other issues with the way Mantle does things that I addressed in a previous post on the subject, but basically the philosophy of Mantle is the opposite of DirectX. Mantle grants low level access but has weak abstraction and developers have to write code to detect and use particular hardware features manually, while DirectX’s entire purpose is hardware abstraction such that developers can write code once and it will run on most hardware without special handling. Most game developers don’t want to return to the days of writing vendor modules in their games because it made development a lot slower and a lot harder to provide universal support.
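The trade-off described above, per-vendor code paths versus a single abstracted path, can be sketched roughly. This is a purely hypothetical illustration; the function, vendor names, and dispatch scheme are invented for the example and are not real Mantle or DirectX calls:

```python
# Hypothetical sketch: without an abstraction layer, an engine must detect the
# hardware family and branch to a hand-written path for each one.
def choose_render_path(vendor: str) -> str:
    vendor_paths = {
        "vendor_a": "vendor_a_optimised_path",
        "vendor_b": "vendor_b_optimised_path",
    }
    # With an abstracting API like Direct3D, the driver makes this choice
    # instead, and the developer writes a single code path for all hardware.
    return vendor_paths.get(vendor, "generic_fallback")

print(choose_render_path("vendor_b"))   # vendor_b_optimised_path
print(choose_render_path("unknown"))    # generic_fallback
```

Every new hardware family multiplies the branches the developer must write and test, which is the maintenance burden the comment is pointing at.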
Point is… it’s an upcoming API that looks like it will have wider support than DX… but you don’t want Nvidia to support it?
I don’t want developers to have to go back to writing vendor modules just to have wide GPU support in games. Good abstraction is necessary for modern PC development.
I guess we can agree to disagree. I see the best future of PC gaming being open, not restricted by platform APIs.
Indeed it should be, but not at the cost of making developers have to write code to handle different vendor hardware. Given the choice of writing for 2 platforms vs writing for 50 graphics hardware profiles, as a programmer myself I can tell you which one I’d much rather do.
As opposed to making a DX version, OpenGL version, PS4 version and whatever else?
I know my ‘view’ is a pipe dream, but I do think it’s the best way to go. Maybe not with Mantle, but something like it. Not supporting new tech just because your competitors ‘created’ it is kinda childish. Again IMHO.
I’ve explained that there are reasons other than ‘because a competitor created it’ to not support Mantle. It’s a flawed design with respect to broad hardware support.
It’s new; of course it has issues.
It’s not because it’s new, it’s a problem with the design philosophy of Mantle. It gains some performance at the cost of abstraction. DirectX gains abstraction at the cost of some performance.
I don’t know how long you’ve been gaming but maybe you don’t remember the days of games having to specifically support hardware profiles like Gravis Ultrasound or ‘Soundblaster-compatible’ and others. Or libraries like UniVBE that were early attempts to provide abstraction across a bunch of different display devices so the developer didn’t have to support them all themselves. Abstraction is essential to modern development, games would never have become as advanced as they have without it.
Mantle loses abstraction for performance, which is great for closed systems like consoles but is awful for open systems like PC. If Mantle’s design philosophy changes in that respect then I’ll give it reconsideration, but as long as that is an essential part of its philosophy, I don’t believe it will (or at least should) surpass APIs that put abstraction first.
You assume a lot of me just because you do not agree with my opinion. I just want to point out that I have not assumed anything of you.
Furthermore, I have never said Mantle is perfect, or a great solution, just that I think it is unwise not to include a new upcoming API that will be as widely available as Mantle looks to be. I’m sorry that you cannot see past that.
As I said earlier, let’s just agree to disagree. Not that I even disagree with what you say. Perhaps if I say it again, rephrased to be more precise:
This is not about the benefits and drawbacks of Mantle vs DX (or any other API for that matter), this is about supporting it or not. And those are two COMPLETELY different topics and I don’t need any programming experience to see that. (Which just by the way, I do have some. Food for thought.)
I haven’t assumed anything of you. I think “I don’t know how long you’ve been gaming” and “maybe” are pretty clear indicators of that.
It sounds to me like what you want is OpenGL, not Mantle. OpenGL is more widely supported, has abstraction at a competitive level with DirectX and is only absent on console platforms. In comparison, Mantle has almost no current support, has drawbacks compared to the other major APIs and is absent on all platforms except Windows PC currently.
I don’t find the argument that Mantle is new so it should be supported to be persuasive. If what you’re looking for is an open standard across all platforms that is not controlled by a particular vested interest, OpenGL is it, and Mantle definitely isn’t.
that’s a really bad decision man, trust me. Mantle is going to be the future. I didn’t even know it existed until today. I checked to see if there were new drivers because I’m playing Dragon Age Inquisition now and I noticed the beta driver. In it I saw the Mantle thing so I looked it up. I ended up trying it and I got a 45% performance increase. This game in a lot of areas is brutal on a system, and I have a nice computer. I still had to turn down a few things like anti-aliasing, shadows and a couple of other minor settings. My frames went from 40 to 60. Then I turned anti-aliasing all the way back up, re-tweaked everything back to full ultra, then I went into the Catalyst Control Center and turned my AA back up to edge detect and supersampling. Before Mantle this would have plummeted my frame rates to about 15, but it’s still at 60. Dude, I’m telling you right now, if you switch to Nvidia right now you are going to regret it. Don’t listen to what the Nvidia fanboys are saying about Mantle; it really is a game changer, dude.
This is most likely to protect NVidia from being accused of reverse-engineering or stealing ATI/AMD tech. They will want to keep their engineers at arms length from anything that is not clearly safe to look at from a legal perspective.
I think people forget that Mantle might not be for everyone, BUT it did push Microsoft into taking DirectX back to the drawing board and to start working on getting more performance out of it, since DX has been lagging behind performance-wise for a long time with no optimisation work being done on it. Suddenly there is viable competition and things get done!
AMD or Nvidia
Intel or AMD
DirectX / Mantle
Competition is good for us, the consumers, no matter the pros/cons.
It’s in nVidia’s commercial interests to put down anything their direct competitor, AMD, puts out.
This is such a non-story.
Meh. I don’t care about Mantle either personally. The Intel/nVidia combo I am currently using seems to work fairly well for every game I play and it works well under Linux too.
Ok,
This may have been said, but I’m saying it. Nvidia has far superior card telemetry than Radeon, so less work to make stuff efficient. You could always access low-level functions on graphics cards. Always, always, always. Nvidia engages better with developers, and has done for a long, long time.
AMD is crap: the entire chipset is unstable, rushed to market, with poor drivers initially.
I’m all for competition, but an Nvidia/Intel machine will be more stable, have fewer issues and better developer support for those reasons.
except now that Mantle is out it’s going to change everything. I downloaded the beta driver just to try Mantle with Dragon Age Inquisition and I had a 45% performance increase. Nvidia fanboys like you need to stop downplaying this monumental technological discovery just because you can’t use it with your current build. Intel is far superior to AMD in the processor department (I use an i7 myself), but when it comes to video cards, Mantle being out changes everything.
It’s not just about performance, though…
Currently, in regards to DX, they are completely in the hands of Microsoft in the direction and progression of the API. If Microsoft decided tomorrow to lock down Windows 9, enforce App-store like installations only, and charge extortionate developer licenses to anyone wanting to make a game with DX, there’s nothing to stop them.
This is the exact reason why Valve are investing in SteamOS.
AMD are wanting to create an open standard which is platform agnostic, created by and for the video card manufacturers and their developers, so they aren’t locked into any one platform. And nVidia don’t want any part of this?
It appears nVidia can see the writing on the wall and are digging their claws in to protect their business model. It won’t be long before discrete video cards are no longer required, at which point nVidia will have no product to serve the x86 PC community. I’ll put my money on it too… 10 years from now, nVidia will be an incredibly niche product in the [x86] PC world, and they will have to put all of their hopes in their ARM division keeping them going [in all fairness, their ARM products are fantastic and are where their future lies].
AMD are wanting to carve out a piece of the pie for themselves in the face of 6 years of declining market share. These are both companies interested in protecting their bottom line, there’s no ‘good guy hardware maker’ in this story.
I’ll take you up on that bet. Integrated graphics will never provide the power to run each generation’s high end games because the bar is constantly being raised and that’s not going to stop happening in the next 10 years. Hybrid chips (eg. APUs) may end up becoming dominant, but who manufactures those? Nvidia and AMD, primarily. The market isn’t vanishing any time soon.
If you think discrete graphics cards are the only thing Nvidia makes, you should probably brush up on their product lines.
nVidia don’t manufacture a hybrid x86 chip. That’s all Intel and AMD. They manufacture ARM-based SoCs, which I acknowledged as being their current/future dominant area.
Which model discrete sound card do you currently use? Which FPU daughter card and memory controller?
Yes, graphics cards are getting more advanced, but like all reality-synthesising units [sound cards, graphics etc.], you get to a point where progression becomes logarithmic. You can’t get any more realistic than reality.
No one thinks twice about using onboard sound [even software sound], because it produces sound of enough quality to be adequate for almost everyone. As chip tech advances, it is only logical for components to merge onto one chip. We already have SoCs for mobile devices, and the desktop space is moving in the same direction.
Maybe 10 years isn’t quite enough time for this transition to take place, but I’d still argue it is a good ballpark. Within this approximate time frame, purchasing a discrete video card will be like purchasing a discrete sound card today… mostly unnecessary for 90%+ of PC users.
And yes, all companies will be in the business of making money; I don’t for one second believe that AMD is the second coming and is above everyone else. But that doesn’t mean that there can’t be any difference in how they progress the technology. From the perspective of a developer and consumer of said technology, there is a clear distinction between AMD and nVidia. It’s not just about clock speeds and 3D Mark scores, it’s about the direction in which each company is taking the industry, of which nVidia seems to be staunchly protecting its current model. Which, hey, they have every right to do, it’s just boring from my point of view 😉
My apologies, I missed the x86 part of your post. I think it’s only a matter of time before Nvidia puts out an x86 APU, they have more experience than AMD in APU building at the moment from the Tegra.
As for what discrete sound card I use, I have a Xonar Essence STX, looking at maybe the Phoebus in future if I can get close to a demo model to test its headphone amp. But it’s somewhat moot, sound is a limited medium with a hard upper limit on perceptible quality (48KHz, 24 bit, to give some leeway) and audio is a fairly easy thing to mathematically process. Graphics have no upper limits thus far and the complexity of a scene being rendered within any upper limits that did exist can be increased exponentially.
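The “hard upper limit” above follows from the Nyquist–Shannon sampling theorem: a sampling rate of f_s can faithfully represent frequencies only up to f_s/2, while human hearing tops out around 20 kHz. A minimal sketch (the function name is invented for illustration):

```python
# Nyquist-Shannon: a sampling rate of f_s can represent frequencies up to f_s / 2.
def nyquist_limit_hz(sample_rate_hz: float) -> float:
    """Highest frequency faithfully representable at the given sampling rate."""
    return sample_rate_hz / 2

# 48 kHz sampling covers up to 24 kHz, comfortably beyond the ~20 kHz
# ceiling of human hearing -- which is the "leeway" mentioned above.
print(nyquist_limit_hz(48_000))   # 24000.0
```

This is why audio has a perceptual ceiling that rendering does not: once the sampling rate clears twice the audible range, further increases buy nothing a listener can hear.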
I stand by my original statement to take you up on your bet. Miniaturisation is reaching its limits with the number of circuit pathways that can be placed within a certain surface area. Currently it’s not possible to fit the chip space needed for high end graphics processing (as seen in the late 700 or upcoming 800 series Nvidia cards, for example) onto a motherboard that is already fairly busy with other tasks. Eventually, technology will have advanced to the point where something like that CAN be included on a motherboard, but it won’t happen in the next 10 years.
Lol, of course I bloody end up in a discussion with someone who actually has a discrete sound card :p
Yeah either way I’m interested to see which way it goes.
One other thing though: what are your thoughts on the current-gen consoles? The PS4 and Xbox One are both now using APUs with integrated GPUs… are they not demonstrating that integrated systems are capable enough for mainstream gaming?
Granted, they are closed platforms and thus can have the effort put in to get more out of them than the same APU in a PC environment, but I think it’s a great demonstration of where the technology is at already.
And if Mantle can bring a similar development process to many platforms, Windows included, it sounds like something lots of people could benefit from [except those who want to keep business on Windows PCs only :p]
I put APUs in a different class to integrated graphics. APUs are more like hybrid CPU/GPU cores put together, whereas integrated graphics is more along the lines of Intel’s offering onboard motherboards. APUs may rise in popularity, but it won’t be because of consoles; rather, mobile devices. I don’t think they’ll surpass separate units in PC gaming though, mainly because combined technology is less flexible: right now I can upgrade my GPU or CPU separately, but combined I’d have to fork out for basically both units just to upgrade one.
The other problem right now is, obviously, that they’re grossly underpowered and highly specific in their design compared to separate hardware. That can change, but I don’t believe APUs will ever outperform separate hardware simply because of the engineering restrictions involved.
If Mantle had better hardware abstraction, I’d appreciate AMD’s efforts better, but from experience and the history of games development, low level access is really only valuable on standardised platforms like the consoles. On PCs where every bit of hardware can differ, it’s a nightmare to develop with low level APIs. The whole reason DirectX was made was to solve that problem. It’s not that I’m loyal to DirectX, but I am loyal to good hardware abstraction.
I agree with the analogy: screens will get bigger but our hearing range will not. There are defined limits with sound.
Plus, sound cards weren’t add-on cards because they were big; they were add-ons because they weren’t considered a necessity for a PC like they are today, so you bought one if you needed one.
This is a load of rubbish. I tried Mantle with Thief (which is a pretty well-optimised game) and got a 10-20 fps boost on the benchmark with my 7950. I miss Mantle now I have a 780 Ti, as it reduces CPU load (I only have an overclocked 2500k). A lot of games cap because of CPU load. I’m looking at you, Watch Dogs.
As a long term Linux user I’m going to agree with Nvidia. ATi/AMD has had 15+ years to get OpenGL working and they haven’t. I don’t trust ATi/AMD’s engineering teams to competently pull off another 3D driver. If they could make a non-crashing high performance OpenGL driver, then I’d take them seriously. This is something that Nvidia has had for years, 3DFX had, and Intel came along and achieved the same thing very rapidly. Only ATi/AMD seem to struggle with it.