PS3 Version Of Rage Has Some Catching Up To Do

What's this? A predominantly PC-oriented developer (in this instance id) having trouble getting a game (in this instance Rage) running well on the PS3? You do. Not. Say.

In what's becoming a frustrating norm for PS3 owners, id's John Carmack has told Edge magazine that while the PC and 360 versions of Rage are running at a smooth 60 frames per second, the PS3 version is managing barely half that, clocking in at only 20-30 fps.

"The PS3 lags a little bit behind in terms of getting the performance out of it," Carmack explains. "The rasteriser is just a little bit slower - no two ways about that."

"The RSX is slower than what we have in the 360. The CPU is about the same, but the 360 makes it easier to split things off, and that's what a lot of the work has been, splitting it all into jobs on the PS3".

Shame. You'd think nearly three years into the console's lifespan, someone would have figured out a way around this by now. Unless, you know, there is no way around it.

Carmack: Rage runs faster on Xbox 360 [Edge, via CVG]


Comments

    Well slower hardware is just slower hardware I guess. The Cell is a nice piece of work, but the RSX really is just a tweaked 7800GT. The Xenos GPU in the 360 is based on the tech that ATI put into the 2900XT, which in itself was a pretty miserable card compared to the 8800 series, but it was still quicker than the 7800 chip.

    Still, I'm quite surprised. With this whole megatexturing thing that Carmack has built up with id Tech 5, I figured the wall in this case would be the amount of available system memory on the PS3, not the GPU. Why Sony never put 512 megs of system memory in the PS3 is really beyond me.

    Still, I can't argue with the stability of my PS3. 1 PS3 vs 4 360s; proper hardware QA for the win.

    In order to get the PS3's GPU running at the same level as (or better than) the 360's, the Cell's SPEs have to be used for graphical preprocessing.

    Unfortunately, Cell being a relatively novel architecture, Carmack is having trouble getting id Tech 5 to offload its tasks onto it. This is the danger in novel architectures.

    Ironically, Carmack himself stated that the PS3 was theoretically more powerful and would eclipse the 360 if programmed correctly. Unfortunately, programming the PS3 with id Tech 5 must be a considerable challenge, to say the least.

    If only Sony got over their fetishistic love of asymmetric clusters of relatively simple microprocessing units optimized for vector coding. It would save developers a lot of pain.

    I'm sure that Carmack will get the FPS stable before the game gets released.

    The lesson that should be taken away from this is that theoretical power is useless if it is not easily accessed by developers. If developers are used to technology that works in "way X," then a hardware company should release a device that works in way X. It should not demand they start learning way Y.

      "I’m sure that Carmack will get the FPS stable before the game gets released."

      You make some very good points about the technology, and the problems adapting to them (especially when Sony actually BRAG about the difficulty involved) but I'm not sure Carmack & Co will be able to get a decent, stable FPS.

      If you have a look at most PS3 games, you'll notice just how many of them don't have stable framerates. Thanks to speed tradeoffs and vsync policies, many PS3 games jump around all over the place. Not to mention tear their little hearts out.

      Also, there's a growing notion out there that many of these big development problems will get ironed out closer to release date. This usually doesn't happen. Games generally do not get substantially faster once out of beta. Prettier, more refined perhaps, but fundamental performance issues? No.

      As for Killzone 2, it's not really a good example. First of all, it's a game built specifically for one set of hardware. It's a game with a lighting model that suits a mostly indoor or closed-off environment. Its draw distances are quite modest. It doesn't have HDR, but the lighting model it does have suits what it needs to do. (You will note, however, that the lighting fakery shows itself when you go around corners or through doorways.) It has a lot of model and texture detail, but it's also extremely repetitive and does not feature a variety of terrains or enemy types. And, worst of all, it struggles to maintain 30fps. As Digital Foundry showed, the game often spends much of its time hovering around the 25fps mark, and much lower in (less important) cutscenes. Finally, because of its use of a deferred renderer and the limitations of the PS3's GPU, it doesn't feature a lot of alpha overdraw. (Note the lack of layered glass, or large-scale particle effects outside of cutscenes.)
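      (For the curious, the deferred-renderer point works roughly like this. A hypothetical stub sketch in C++, not Guerrilla's code: a deferred renderer stores one opaque surface per pixel and lights it once, so alpha-blended layers have to go through a separate forward pass instead, and that is the pass you keep small on the RSX.)

        // Hypothetical stub sketch of why deferred renderers avoid heavy alpha overdraw.
        #include <vector>

        struct Mesh { bool transparent; };
        struct GBuffer { /* per-pixel albedo, normal, depth... one surface per pixel */ };

        // Opaque geometry: write attributes once, then do all lighting in one screen pass.
        void geometry_pass(const std::vector<Mesh>& meshes, GBuffer& gbuf) {
            for (const Mesh& m : meshes)
                if (!m.transparent) { /* rasterise m's attributes into gbuf */ }
        }

        void lighting_pass(const GBuffer& gbuf) { /* one lighting evaluation per pixel */ }

        // Alpha-blended surfaces need every layer shaded and blended in order, which a
        // single-layer G-buffer can't express, so they fall back to a forward pass.
        void forward_transparent_pass(const std::vector<Mesh>& meshes) {
            for (const Mesh& m : meshes)
                if (m.transparent) { /* shade and blend m directly; costly overdraw */ }
        }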

      Put another way: they spent years and god-knows-how-much money making the game, yet it's very console-specific and is still dodgy in the framerate department. It seems a bit churlish for anyone to expect a cross platform game to do any better, especially one that hasn't been bought lock-stock-and-barrel by Sony.

      And on that note: Ghostbusters PS3 anyone? Now there's a frame-rate debacle, from a developer that bragged about the PS3, no less.

    Hold on, hold on, wait...
    Wasn't it not long ago that id were whinging about the 360 not being adequate for Rage and praising the PS3?
    I guess they were still early in development and hadn't realised just how much trouble the PS3 would ultimately be.

    How the mighty have fallen. I remember JC being THE GURU who would test every video card on the market to make sure it would work with 'his' games, and if not, he'd code more to make them work. Now he's just another programmer using an MS dev kit and then trying to port over to the PS3, which we know from previous attempts (GTAIV, FO3 etc etc) does not work. Let me say that again: it does not work.
    I really expected more, John.

    & of course it runs at 60fps on the Xbox 360 (because they would have set this as a specification), and any decent PC can do 60fps with a 360 port (especially if you don't bump up the res or eye candy), so unless they're talking 60fps @ 2048x1536 with full AA & AF (and I'd like to see that plz /drool) on a sub-$1000 PC, then that is nothing to brag about.

      You expected more? Like what? Building the PS3 version of Rage from the ground up? And bragging... good lord, there is no bragging in there. Armchair programmers like yourself are the problem plaguing Kotaku's US portal.

      @warcroft: "Whinging"? They stated that the 360 version needed to be spread across two discs. That isn't whinging, it was just a fact. No, the 360/PS3 fans are the ones that started the whinging in response to that, with the Sony camp proclaiming it as a Blu-ray triumph. Now that situation is reversed.

        @Alex
        I didn't expect JC to just roll over and play dead; he WAS a programming GOD. And no, I don't expect a ground-up build just for the PS3 (either it's too hard or they're too soft, obviously) & I'll just play it on my PC at a decent resolution and with controls that don't need aim assist to be accurate.

        FYI, I couldn't be bothered reading the US portal, because I'm an Australian...

        I'm now beginning to wonder how this counts as NEWS.

        @SammyC

        "i didn’t expect JC to just roll over and play dead, he WAS a programing GOD"

        Still is dude, still is. Programming gods don't just forget how to program or design, after all. And I'd bet my house he could program rings around you. ;)

        If they're having trouble getting a good framerate on the PS3, then Carmack is the very last person you could possibly accuse of laziness or inability. The man's achievements are there for all to see, and he knows what he's talking about.

        If you really want someone to blame, contact Sony. They've openly bragged about the PS3 being difficult to develop for. Not much to brag about, I would have thought...

    So technology should stay exactly the same as it is now? Nobody should ever push the boundaries again, or try to make something new & different?

    It's how technology progresses. Somebody somewhere needs to take a risk every now & then, do something a little bit out there. Sometimes it pays off, sometimes it takes a while to pay off, & sometimes it falls flat on its face in the mud.

    If technology companies didn't ever try anything new, there'd be no Blu-ray, there'd be no DirectX 10, there'd be no Shader Model 3 on graphics cards, there'd be no hard drives in games consoles, there'd be no Wii with its funky controls. All PCs would still be programmed in Plankalkül from the 1940s.

    Just because developers are used to working in certain ways doesn't mean they can't or shouldn't learn something new. Sometimes new turns out to be better than old.

      I never said the technology should stay "the same." You seem to be missing the distinction between an ADVANCEMENT of an old technology (DirectX 10, for instance, is an advancement of previous versions of DirectX) and a completely new technology.

      The distinction is "evolution versus revolution." The former (small incremental change and improvement) is something that developers can easily adjust to. The latter requires retraining developers, which takes time, money and effort.

      Yes, doing something "a little bit out there" is wonderful. The problem is that Cell is not "a little bit out there"; it requires a completely different programming paradigm. The existing skill set of developers has to be DISCARDED and a new one developed. You aren't simply adding new techniques to the old programming model; you are creating a totally new programming model.

      This creates learning curves and higher development costs.

      If Sony had built on existing technology with only some minor modifications and enhancements, plus Blu-ray, they wouldn't have this problem. But no, they wanted to invent an entirely new CPU architecture (even if Cell is derived from POWER, the SPEs are a totally different ISA), one whose primary benefit (according to Dr. Peter Hofstee, the CREATOR of the Cell chip) is efficiency per transistor and acceleration of vector programming, and have complex games made to run on it.
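      (For what it's worth, the "vector programming" Hofstee refers to is less about learning a new language and more about reorganising data so the SPEs can stream through it. A rough, hypothetical C++ sketch of that shift, not actual Cell code: the familiar object-at-a-time layout versus the structure-of-arrays layout a vector unit wants.)

        // Hypothetical sketch: array-of-structs (comfortable, PC-style) versus
        // structure-of-arrays (what a vector unit such as an SPE prefers to stream).
        #include <cstddef>

        struct ParticleAoS { float x, y, z, vx, vy, vz; };  // one object at a time

        struct ParticlesSoA {                                // contiguous lanes of each field
            float* x; float* y; float* z;
            float* vx; float* vy; float* vz;
            std::size_t count;
        };

        void integrate_aos(ParticleAoS* p, std::size_t n, float dt) {
            for (std::size_t i = 0; i < n; ++i) {            // scalar, one particle per step
                p[i].x += p[i].vx * dt;
                p[i].y += p[i].vy * dt;
                p[i].z += p[i].vz * dt;
            }
        }

        void integrate_soa(ParticlesSoA& p, float dt) {
            for (std::size_t i = 0; i < p.count; ++i) {      // contiguous streams a SIMD unit
                p.x[i] += p.vx[i] * dt;                      // can process several at a time
                p.y[i] += p.vy[i] * dt;
                p.z[i] += p.vz[i] * dt;
            }
        }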

      Economically speaking, this is impractical to say the least. And Sony's dismal fiscal performance has vindicated my stance.

      "So technology should stay exactly the same as it is now?"

      Strawman argument.

      "If technology companies didn’t ever try anything new, there’d be no Blu-Ray, there’d be no directx 10, there’d be no shader model 3 on graphics cards, there’d be no hard drives in games consoles"

      You do realize that the guy in question -- Carmack -- has been the loudest proponent of shader models for gaming since, oh, before games could ever use shaders?

      Implying that Carmack isn't trying anything new is, well, it's ridiculous. The guy has been trying and CREATING new tech for his entire career. He's the reason we have certain technology and entire game types now. I'm not in love with what he does with that technology, but his passion for it is admirable.

        After all, Quake was built on the first TRUE 3D gaming engine. If anyone chooses to actually argue with this by bringing up Doom, might I remind you that Doom was a 3D illusion built on a 2D environment.

        @Kaden101: Carmack is "Carmack" because he is known for trying new things.

        It's funny seeing some of the arguments a PS3 fanboy will use to protect his original argument about the PS3 "taking time". I think we've just kinda stopped listening...

    I'd say 60fps on a console is a good bragging right, especially if it's running at a native 720p with decent texture res instead of one of those dodgy upscaled low-res jobs. Most console titles these days lock themselves in at 30fps.

    I'm quite certain Carmack still has the mojo.

    Hang on, wait, WOAH. Back up, ladies and gentlemen. Carmack has just said "The PS3 is proving to be difficult to program on", not "PS3 is teh suxxor". He hasn't even said that he's had to retrain all of his staff because the PS3 is apparently "a whole new programming paradigm". Jesus Christ, you monkeys can blow things out of proportion sometimes.

      You misinterpreted my post. I was not saying that Carmack, himself, had to retrain his staff. I was trying to explain to Kaden101 the downsides to radically novel technologies (specifically, learning curve effects).

      You are correct. Carmack never said that the PS3 sucked. Nor did I. I concede that potentially the PS3 is more powerful than the 360. Killzone 2 proves this. All I stated was that accessing this power requires a skill set which is substantially different to the skill set employed by PC and 360 developers (and technically Wii developers since Wii has a similar architecture to the 360: PowerPC CPU (actually a more PC-like one than the 360's) + ATI graphics chip).

      I think that by the time of release, the three versions will perform roughly identically. Carmack will eventually figure out the tricks to optimize id tech 5 for the PS3.

    Biggest fanboy topic of the closing week Kotaku.

    On a side note, id will get there... it's just a matter of time and help. That's it. There is no doubt the PS3's hardware is superior to anything else on the market; what id are having problems with, as with a lot of third-party developers, is getting the help they need to be able to overcome these issues.

    Nothing great was ever easy; the same applies in programming... the more you put into it, the greater it will turn out. I just really wish developers would stop bitching to the media whenever they have a hard day at work and actually sit down and figure out a way to solve the issue instead.

    StudiodeKadent
    You actually stated that "If developers are used to technology that works in “way X,” then a hardware company should release a device that works in way X. It should not demand they start learning way Y."

    That statement alone at the end of your comment gives the impression that you think releasing a radically different system that requires learning something new is something that shouldn't be done. I think maybe you did miscommunicate what you really meant, as this was the tone of comment that I was replying to.

      I apologize if my tone resulted in a misunderstanding.

      Although I do think that releasing, onto the console market (not necessarily any other market), a radically different system that requires programmers to learn a very large amount of new practices is not a wise thing to do, generally, under current market conditions.

      Consumers buy hardware to run software. Thus the obvious thing to do is to create a hardware platform that is easy to build high quality software on. This naturally would imply that the most reliable route is to use an architecture that developers are familiar with, and maybe make a few enhancements and innovations. "Update" and "evolve" and "improve" rather than "revolutionize," simply to minimize the learning curves.

      You are correct, my tone probably did make it easy to read a stronger meaning into my argument than originally intended. I'm not decrying the invention of new technologies. I'm simply saying that unleashing such radical novelty on a consumer market with relatively short notice is not an economically sound thing to do (for the firm making said device), when the sales of said device are critically dependent on complementary goods (i.e. games for consoles).

      Apologies for any lack of clarity on my part.

        No worries, & I get your reasoning. Whilst I don't think Sony did anything wrong releasing a radically different product, I do understand that anything different requires a learning curve, some steeper than others, & in turn this has caused delays while 3rd parties come to grips with the technology. 1st & 2nd party developers have got more out of the PS3 simply because they have spent more time exclusively with it.

        I will just say one thing regarding your comment on releasing it under current market conditions. When Sony were doing the R&D for the PS3, & even when they released it, we weren't in a recession (far from it). It would still have been more financially viable to release the PS3 as is rather than spending more time & money on R&D to make hardware that's easier to work with. It's a pity that it's taken this long for 3rd parties to get close to understanding the needs of the PS3's architecture, but that's the reality of the situation.

    This may just be an early development stage in the game, with the PC and X360 just being more familiar to id, and therefore optimised faster. The PS3 may just require more time, and some Sony engineers teaching id how PS3 programming is done.

    This does sound similar to the announcements Bethesda made about the PS3 being troublesome to program for during the Oblivion conversion, but that game arguably came out BETTER than the X360 version.

    I am also hoping that you CAN teach an old dog new tricks......

    More lazy developers that can't be bothered.

    How do you get a PC running at 60fps?

    Hahahaha, I find it pretty funny that all those PS3 fanboys based their arguments on the PS3's need to "catch up". Their argument went something along the lines of "give the PS3 a few years, and we'll come back with an incredibly powerful machine that will kick the 360's arse!". Well, it's been just over 2 and a half years now, guys, and we're still waiting...

    It's the same arguments for the PS3... "it's more powerful, developers just have to learn to program to it, Cell Cell Cell".

    Well, maybe the PS3 just isn't as great as everyone has been led to believe.
    I mean, how long does it take?
    How long are we going to hear these arguments for the PS3?
    The PS3 just isn't as good as it's made out to be.

    Womble made a great point about Killzone 2. It's what I've been telling people since its release. It's all smoke and mirrors.

    Developers SHOULDN'T be having so much difficulty making games for a console. It only encourages half-arsed ports.

    Simple fact is, some engines just run better on the 360 than the PS3, and vice versa. There's no reason people should get annoyed or angry about this; it's how programming works.

    Not to mention Carmack and his company are very much PC devs, and the 360 is much more of a 'PC' than the PS3 is hardware-wise, so this isn't any surprise at all really.

    That's the problem with PS3 programming: unless you've got people who know how to do it, you're kinda screwed.
