Direct3D 12 Vs Mantle: AMD Clarifies Benefits And Differences

Last week, NVIDIA made it clear that it's not interested in what AMD is doing with Mantle — the GPU maker is more than happy with the improvements the next version of Direct3D will bring, with its performance tweaks in particular closely resembling what Mantle offers. It's only fair that AMD gets a chance to defend itself and its fledgling graphics API, which is precisely what it's done in this 21-minute, talking head-heavy video.

In the clip, Dave Nalasco, technical manager for AMD, does his best to outline what Mantle offers over Direct3D 12. Honestly, he doesn't do the best job: his angle is that AMD's option serves a "slightly different audience", where Microsoft is focused on broad compatibility and AMD is all about performance. DSO Gaming's John Papadopoulos scavenged the following quotes from the talk:

So DX12 is a major update and one of the interesting things about DX12, based on the information that Microsoft has provided so far, is that a lot of the goals they have for it are clearly very similar to what we're doing with Mantle ... Mantle is all about trying to give tools to developers who want to extract more performance out of a game in any given situation, whereas DirectX is primarily about getting a broad set of compatibility so you can run your code once for DirectX and have it run on all of the different PC hardware that is out there.

Nalasco concludes with the belief that "a lot" of developers will "definitely" implement both Mantle and Direct3D 12, though AMD is committed to supporting both APIs regardless of uptake.

The main advantage AMD has over Microsoft is that Mantle is available now, and it will only become more refined and battle-tested by the time D3D 12 games start landing at the end of next year.

Sure, Microsoft can guarantee a crapload of exposure for its tech before it officially lands, but by then Mantle will potentially be available in any number of games — and compatible with non-AMD GPUs, in the unlikely (though possible) event that NVIDIA changes its stance towards the API.

Everyone is also quick to forget Intel, which holds the lion's share of the PC graphics market thanks to its fast-improving integrated hardware. Though last we heard, the company received a lukewarm response from AMD when it asked about Mantle, and Intel's priority will always be compatibility over performance.

There's no question Intel will support Direct3D 12, but AMD would be wise to get the silicon giant on-board, despite what's happening in the CPU space.

AMD On Mantle & DX12: Both APIs Can Co-Exist, Mantle Benefits Both Developers & Gamers [YouTube, via DSO Gaming]


Comments

    with trying to make things easier... they are just splitting it and making it harder (and more time consuming).
    It could be the death of them. Only time will tell.


    DirectX is primarily about getting a broad set of compatibility so you can run your code once for DirectX and have it run on all of the different PC hardware that is out there.

    "Hey programmers, did you know that DirectX 12 will allow you to program a single set of instructions and then not have to worry about compatibility issues on a wide range of hardware, functioning like the interface layer DirectX was designed to be from the start only this time it'll be even more refined and efficient since it's a newer version?"

    What a great advert for DirectX, good job AMD...


    Considering the never-ending driver issues I've had over the last 13+ years on my constantly upgraded AMD graphics cards, I think AMD would be wise to focus its energy there.
    I have not had an nVidia card since the Riva TNT2, and to be honest I think I'll be going to nVidia with the GTX 880s when I can get a pair.

    I get every 2nd generation, and I dunno why I was such an AMD fanboy. Besides price-to-performance, it has never been worth it. With the amount of PC gaming I do and the money I spend, I'm better off paying more for top-end nVidia and getting a rock-solid experience.

      Over the course of the 5 years that I've been with AMD, I've only had trouble with RAGE. I have 200 games in my Steam library and have played them all (not finished them all, obviously). Guess YMMV.

        Wolfenstein: The New Order had issues needing new drivers, and never got Crossfire support.
        Watch Dogs still has massive stuttering. Skyrim and Serious Sam 3 could not use (at the time) the latest 3 driver releases without a BSOD; you had to use older ones. Far Cry 3 had massive issues for about a month. Deus Ex: Human Revolution goes bonkers with flickering on the 2013 drivers. Lords of Shadow 2 has flickering with every release. The list goes on and on. It's not a YMMV scenario as such; it's more that if you buy big releases day 1 and constantly update drivers so that Eyefinity, frame pacing and Crossfire scaling work less crappily, then AMD has been a bad experience ever since the GCN architecture released.
        Single-card AMD is OK; pairing is terrible.

          It probably is a case of single card vs Crossfire. I've played all of those games on release day with no issues (yep, even Watch Dogs didn't have too much stuttering). But I would imagine Crossfire would not work well; then again, SLI doesn't fare well in most games either.

          That's not to say that issues don't exist on other AMD setups, but the distribution of issues between the two gfx companies is probably closer than imagined.


      It's been a mixed bag for me. I haven't encountered any driver issues, except that when I swap from nVidia to ATI, the leftover nVidia drivers cock-block the AMD drivers, but that's fair enough.

      Other than that, smooth sailing with AMD, but 3 out of the 4 recent nVidia cards I've had stopped working. They just gradually stopped being recognised as nVidia cards and only came up as a 'standard VGA adapter'. They were in series known to develop this issue: a GeForce 7200, 8200 and 550 Ti (the one that never gave a day's trouble was a 220).

      Anyways, I'm 'vendor agnostic'. I just buy whatever has the best price-to-performance when I'm shopping. Last time it happened to be AMD; I think I'll go green next time to shake things up.

        Not sure I'd call any of those Nvidia cards 'recent', the 500 series is the youngest and even that's 4 years old. In any case, I hope you have a better experience this time around if you do go Nvidia for your next card. The 780Ti is a rock solid performer as a second generation Kepler card. The 800 series uses first generation Maxwell architecture so there may be some initial issues, but they tend to get ironed out fairly well these days.

          Out of MY recently owned cards... I had the 550 Ti before the card I have now, and only upgraded at the end of last year. I had only owned the 550 Ti for just over a year. I'm not completely sure what the cards' age has to do with my comment anyway.

          If I go back a bit, not the whole way, my list would pretty much go: ATI OLD AS SHIT, GeForce 7200, 8200, 220, 550 Ti and now an ATI 7870. (As you can tell, I went through a pretty heavy nVidia period, just because they A) offered neat features, and B) were cheaper where I lived.)

          Sorry for any misunderstanding, but they are my 'recent' nVidia cards, as in ones I have owned recently. And just because I had ones that started becoming intermittent after 6-8 months (but not faulty enough to return them; every time I was just about to, they would start working again, and I just figured it was a driver issue until I found out otherwise) doesn't mean I'm saying 'every' nVidia card would do that... I do have shitty PC luck; I was just stating my experience, as in, people have bad experiences on both sides.

          The reason I want nVidia next is just that I can't get behind a re-branding of the card range I already have, mostly because of the heat and power consumption; it's why I got the card I have now. I bought the 7870 to 'tide me over' till they move to the smaller fabrication process. In that regard, the move to smaller manufacturing is taking painstakingly long lol.


    I really wish AMD had spent the money just getting developers to tune their games for AMD GPUs before launch.

    I can't see Mantle becoming widely adopted (in the sense of "adopted by developers with in-house game engines on multiple platforms"), simply because it's one more thing to implement, debug and maintain (see the sketch below), and the benefits are marginal and/or only applicable to specific instances of bottlenecks.

    I expect it'll end up in a few of the third-party engines (Unreal, etc.) because those are all about selling the engine to developers, so more performance/features are always a pleaser.

    The last in-house game engine I worked with supported PC, X360, PS3, Wii and iOS... and Mantle only supports one of those platforms. Throw in XBONE/PS4... and you still have one supported platform. (From what I've read since the announcement, Mantle is not what is actually on either next-gen console... though it's similar.)
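    To make the maintenance cost concrete, here's a minimal sketch in C++ (hypothetical names, not any real engine's code) of the kind of renderer abstraction an engine like that carries. Each graphics API needs its own backend behind one common interface, so adding Mantle means writing, debugging and shipping one more of these, for a single platform:

        // Minimal sketch, hypothetical names: the per-API backend pattern
        // an in-house multi-platform engine typically uses.
        #include <memory>
        #include <stdexcept>
        #include <string>

        // The interface the rest of the engine codes against.
        struct IRenderBackend {
            virtual ~IRenderBackend() = default;
            virtual void BeginFrame() = 0;
            virtual void DrawMesh(int meshId) = 0;
            virtual void EndFrame() = 0;
        };

        // One concrete backend per supported API/platform (bodies stubbed here).
        struct D3D11Backend : IRenderBackend {
            void BeginFrame() override {} void DrawMesh(int) override {} void EndFrame() override {}
        };
        struct GLESBackend : IRenderBackend {
            void BeginFrame() override {} void DrawMesh(int) override {} void EndFrame() override {}
        };
        // The proposed addition: one vendor, one platform, but the same
        // full implement/debug/maintain cycle as every other backend.
        struct MantleBackend : IRenderBackend {
            void BeginFrame() override {} void DrawMesh(int) override {} void EndFrame() override {}
        };

        std::unique_ptr<IRenderBackend> MakeBackend(const std::string& api) {
            if (api == "d3d11")  return std::make_unique<D3D11Backend>();
            if (api == "gles")   return std::make_unique<GLESBackend>();
            if (api == "mantle") return std::make_unique<MantleBackend>();
            throw std::runtime_error("unsupported API: " + api);
        }

        int main() {
            auto renderer = MakeBackend("mantle"); // selected per platform/vendor
            renderer->BeginFrame();
            renderer->DrawMesh(42);
            renderer->EndFrame();
        }

    Every extra backend multiplies the QA surface for every title shipped on that engine, which is why a vendor-specific API is a hard sell outside the big licensed engines.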

    Translation: AMD sucks ass at writing compliant drivers for OpenGL and DirectX and wants to push the market to using their product. Meanwhile Intel, Nvidia, OpenGL, and DirectX all work just fine without crappy Mantle and AMD. AMD should pull their head out of their ass and spend real money on making their drivers API compliant and stable with OpenGL and DirectX.

    Actually it's more a matter of cost, as most devs will NOT set up the added software for either Crossfire or AMD in general, as they HAVE to have different software or face legal hassles. Kinda like the big companies still using Win XP because they don't want to spend the money or the time.
