Lack Of Innovation In PC Graphics Hardware? Blame Intel, Says Epic's Mark Rein

These days, only a few triple-A games really stress out modern 3D hardware. With the plethora of less-demanding indie titles making up a fair chunk of the average gamer's library, grabbing a new video card isn't the regular ritual it once was. But is there something else to blame for the slowed pace of GPU innovation? Epic head honcho Mark Rein thinks so.

Recently on Twitter, Rein posted the following messages:

Sadly, Rein doesn't provide a timeframe for when these continual attempts to convince Intel took place, but Epic has been closely involved in the evolution of graphics hardware for a long time, thanks to its popular Unreal Engine middleware. So you'd expect that Rein knows what he's talking about.

That said, it may well be true that, in the past, Intel didn't make much of an effort to push its GPU technology as far as the likes of NVIDIA and AMD. But why would it? As Rein mentions, the company dominates installations thanks to integration, and for desktop activities such as browsing and video playback, Intel's hardware is more than adequate for the majority of users.

The other thing to keep in mind is that Intel's priority with its GPUs has never been performance, but power consumption and heat. Being the hardware of choice for almost all notebook manufacturers is not a mantle to be taken lightly. While NVIDIA and AMD can be less stringent with these requirements -- their hardware is completely optional -- for Intel, efficiency has to be at the top of the list. True, there's definitely a balance in there, but I can totally understand why "good enough" would satisfy Intel and its paying integrators.

On top of this, Intel did try to advance its GPUs with the ill-fated Larrabee project and the acquisition of the (at the time, visually impressive) Project Offset. The endeavour backfired, and Intel stayed quiet about its graphics efforts for a long time.

That was until Haswell and its "GT3e" hardware (now known as "Iris"), which has shown itself to be a very capable GPU for its intended market -- notebooks and tablets.

These days, Intel's actually doing quite a bit in the 3D space, from doubling and tripling the performance of its integrated GPUs to tabling new Direct3D extensions, such as PixelSync and InstantAccess, that improve visuals and performance.

So, Intel may not be at the forefront of GPU technology, but it has most definitely picked up the slack in the last couple of years. Whether this is by choice, or forced by modern operating systems gradually offloading common graphics (and some non-graphics) tasks from the CPU, it's hard to say. But I look forward to seeing where Intel takes the industry, now that it's thrown some of its substantial weight behind the task.

@MarkRein [Twitter, via DSO Gaming]

Graph via Anandtech / Intel


Comments

    Nice graph, Intel. Now change to a logarithmic scale and show where AMD and NVIDIA lie.

      ^This.
      It's all well and good to double the performance, but not when you're starting from 10FPS.

    It seems to me like the problem is more with the developers who are content with putting out less demanding games, or half-arsed PC ports of console games that don't really take advantage of the PC's extra power.

    There are awesome video cards on the market, but there's very few games that actually take advantage of all of their features.

      I "upvoted" this, but also consider:

      Few games take advantage of the hardware on offer, because they are console ports?
      Perhaps.
      Crysis certainly didn't *start* on the consoles, but that's where it's ended up.

      However, maybe that's the inherent problem: if Intel thinks its stuff is "up to scratch" for the current console generation of games, that's the reason it hasn't seriously tackled this "issue", and therefore the cycle repeats -- developers build for what sells best?

        Few games take advantage of the hardware on offer, because they are console ports?
        Perhaps.
        Crysis certainly didn't *start* on the consoles, but that's where it's ended up.

        Yes, I did consider that, but honestly, if a PC port is done properly, it provides graphics options to pump everything up to max if your hardware supports it. It's very lazy not to do that.

        The Assassin's Creed games on PC are a good example of PC ports of console games done right. They DO provide the options to seriously pump up the visuals, and when you do, the games look a TONNE better than the console equivalents. Seriously, it's hardly even close. Ubisoft may get things wrong but their PC ports are an example of one thing they do get right. Resident Evil 5 is another example of a PC port that quite impressed me. And this is something that all devs doing PC ports should strive for.


        I have a laptop and a desktop (a hybrid), both with Intel integrated chips (probably four or so years old now); neither is up to scratch with the current generation of games, and both are probably fit to run Xbox 1 graphics and nothing more.

        Now, I didn't get those PCs for games (I'm a console gamer), but Intel is still a disappointment, and yes, I think Rein is correct. It's also true that laptops are outpacing desktops in the PC market, and that would have an impact too, though a lot of the laptops I see on sale these days seem to have decent graphics cards installed.

          Have a look where 4 years ago is on the graph.

          I bought a new ultrabook yesterday with an HD 4000 and was astonished at the graphics performance. I was unable to get a model with a GeForce card in the form factor I wanted (XPS 12), so I settled on the integrated default.

          It's amazingly... competent.

      I agree. There aren't many developers who put in the time to do an over-the-top PC version of a game. For the most part, it's money they're not willing to spend. Crysis was the best example: it started off as a game that pushed PC hardware to the limit, then the sequels came along and barely pushed the boundaries of anything. Thankfully there are games like Tomb Raider where the graphics are tailored to the PC. The majority of developers won't do this, though.

    Sounds like Intel's business plan: move in, suffocate competition and stifle innovation while it can't compete fairly, corner the market when it can, then charge premium prices for inferior products. All it needs is a willing Microsoft to help out.

      "Hello, Intel. It's Redmond. Yeah, hi. Look...we need a favour. We need you to flood the market with shite GPUs, but act like they're actually the best. Then, when people are pissed they can't play the games they want, bam! They'll come a-runnin' to the Xbox. Genius. Thanks."

    I'd say that my MacBook Air's tiny ULV HD 4000 is pretty incredible; I played a shitload of Portal 2 on it through an Apple Thunderbolt Display before I bought a proper gaming PC, so they're not hopeless.

    But I would agree with anyone who rails against complacency. If the majority of people have PCs and Macs with integrated graphics, any sane developer would make sure those people can play their game, so if this technology is stagnating/sucks/whatever (even comparatively), it brings everyone down. Epic would be a bit screwed if they beefed up Unreal Engine 4 beyond Intel's graphics in the hope that Intel picks up the slack, because I'd say the result is that developers would just stick with the current engine, so I can see why they'd be mad about it.

      I have seen some really good gaming laptops, but unfortunately none of them had Intel GPUs... The Source engine, on the other hand, is very CPU-dependent, so it doesn't need a very good GPU at all.

        Yep, this is why TF2 runs beautifully on my laptop, and it doesn't surprise me that Portal 2 does either. Source seems way more scalable than Unreal or any of the big engines.

          Are you sure it's not just that Source is a ten-year-old game engine not lumbered with bloat caused by pandering to the consoles?

    Intel ARE good enough... for the people they're aimed at. Not everyone wants an HSV when a Barina will do.

    Many of their computers go straight to people who don't require an over-the-top graphics card. Almost anyone with a gaming rig knows that using integrated graphics for games is a horrible idea, and they'll immediately go for an AMD or NVIDIA card. Even if they didn't know, as soon as a person says they want a gaming computer, they'll be pointed to something with an AMD or NVIDIA card in it. Those who buy a gaming computer with nothing but integrated graphics either don't mind not having the best graphics or were ripped off by the sales clerk.

    Is Intel to blame? From a business standpoint, hardly. If there's no demand, there's no need to innovate, especially when the majority of your target market aren't gamers who play graphically intense games. Sure, Intel could try to wrestle control of the gaming market from AMD and NVIDIA, but then we're back to the point about market share. It already has most of the market, and spending the time and money to take the remaining share from gaming giants like AMD and NVIDIA probably wouldn't be worth it -- at least not in the current age, where consoles and indie games have stagnated the need to produce fancier hardware. I would definitely blame the lack of demand before blaming Intel.
