News Of Okami HD's 30FPS Lock On PC Has Not Gone Down Well

Image: Capcom

The issue of games with locked framerates is a touchy one, depending on who you talk to. If the game in question is on PC, well, odds are you're going to get a lot of grief from players who want to get the most from their high-end hardware. Unfortunately for Capcom, it was recently revealed that the Okami HD PC port will be locked to 30fps, much to the chagrin of fans looking forward to its December 13 release.

A Steam user going by the handle "iRumy3" made a post on Okami HD's discussion forum containing a link to the image below:

Image: iRumy3 (via Steam)

The screenshot is a message from Capcom community manager "wbacon" confirming that Okami HD's framerate is locked to 30 frames per second, primarily because the title exhibited problems with "game logic, collision detection and [animation] speed in certain areas".

It goes on to mention that addressing these issues would require a "Hurculean [sic] effort" involving "trial and error".

Understandably, the statement caused a bit of a stir.

That said, Okami HD plays well enough from what we've seen, so it's debatable whether the lock will have much of a negative effect on gameplay.

The reality is that bringing old games to new platforms isn't always straightforward, and while we can curse the original developers for not thinking far enough ahead, sometimes you have to make do with what you have.

[Steam, via DSO Gaming]


Comments

    OK, full disclosure: I'm just a vanilla programmer, not a game programmer. I've done some reading on game programming, but there must be something done in practice (and thus rarely covered in the theory literature) that I'm not seeing.

    If this were the 80s or even the early 90s, I could understand having a lot of the timers, clocks, etc. synchronised to the machine itself (be it the IBM PCjr, Sega Megadrive, etc.).

    But this is 2017. Surely there is room to give the game logic its own timer/clock and not have it hard wired to the refresh rate.

    Heck, I was even shocked to see first-hand how this impacts Fallout 3. At 120 hertz, the game still plays but all the character animations finish before their audio tracks.

    Again, I'm no game programmer, but I think it safe to say it should be unacceptable to have the game logic timed to the frame rate.

      There are a number of challenges. Yes, you would generally decouple the loops these days, particularly for multi-core systems. But Okami was developed in the PS2 era, where you more or less had a single main loop, and by and large people didn't care about 30 vs 60fps.

      The problem with lifting the cap is that your physics and collision code were developed and tested at a specific hertz. When you change the time delta that you're integrating by you change the results of the equations.

      Suddenly objects start falling through the world, getting stuck, or building up huge forces and firing off into the sky. All because you've changed your tick rate and made the simulation unstable.
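
      To make that concrete, here's a toy sketch (my own illustration, nothing to do with Okami's actual code) that integrates the same jump arc with semi-implicit Euler at 30Hz and at 60Hz. The two runs peak at different heights, and anything tuned against one tick rate - jump heights, reachable ledges - silently changes at the other.

          // Toy example: the same jump arc, stepped at two different tick rates.
          #include <cstdio>

          // Returns the highest point reached when integrating at a given dt.
          double peakHeight(double dt) {
              double y = 0.0, vy = 10.0;   // initial upward velocity (m/s)
              const double g = -9.81;      // gravity (m/s^2)
              double peak = 0.0;
              while (y >= 0.0) {
                  vy += g * dt;            // integrate velocity first...
                  y  += vy * dt;           // ...then position (semi-implicit Euler)
                  if (y > peak) peak = y;
              }
              return peak;
          }

          int main() {
              std::printf("peak at 30Hz: %.4f m\n", peakHeight(1.0 / 30.0));   // lower
              std::printf("peak at 60Hz: %.4f m\n", peakHeight(1.0 / 60.0));   // higher
          }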

      It's worth noting that every re-release has been a visual refurbishment of the original 2006 PS2 version. Not sure how that factors in, but it's a consideration.

        Still doesn't make any sense though.

        If the game is to be updated to work on a new system with a completely different architecture, why carry over design elements that were influenced by the source platform?

          I edited one word in my comment and it's been sitting in moderation ever since. So there's a slightly different version of this that may show up at some time.

          Basically, though, if you change the update rate of a simulation you change its outcome. In games that means that the level design that was created with a certain frame rate in mind essentially falls apart. For example you may fall through holes you never could before.

          From the developer's comments it seems to me that they tried increasing the fps but the game became unstable. At that point you need to either play test and hope you find and fix all the issues or rebuild the whole geometry from scratch. Both options are expensive and time consuming.

          It's better to leave things in a stable state and improve what you can within your budget.

            But why can't I play on my 200hz 4K screen? WHHYYYYYYY?

            In games that means that the level design that was created with a certain frame rate in mind essentially falls apart. For example you may fall through holes you never could before.

            Again, why are such elements keyed to the frame rate?

            If one were going from, say, a PCjr to the first 386, that would be understandable, as the only real clock was the CPU's own.

            But again, this is 2017.

            Rendering should be independent of the game logic. It should only focus on the 3D environment, and the frame rate should only concern the viewport/camera/whatever it's called now.

            At that point you need to either play test and hope you find and fix all the issues or rebuild the whole geometry from scratch.

            That still seems Rube Goldbergian: this has got to be at least the second or third port of the game. If they have that amount of money to do this so many times, they should have put the money into modernising the engine.

            Don't get me wrong, I'm in the camp where game design and experience always come first and graphics is at most a second or third priority.

            But at the same time, the art of creating game engines also needs to adapt over time and what we are seeing is something that belongs back in the 80s/early 90s where the CPU clock was the only clock and everything was timed to it.

            Refresh rates should not have an influence on how fast an object falls to the ground, how quickly my timer counts down when I do a fetch quest, nor anything else.

            The resources needed to have an independent timer exist even in consoles these days, so there is no excuse for not keeping a separate timer that is completely independent of the graphics system.

            Among other things, this makes the developer look lazy and/or unprofessional (letting an 80s era problem dominate in this day and age).

              So, imagine you've fired a bullet from a gun. It travels fairly horizontally at first before gravity causes it to curve and eventually hit the ground.

              When you're simulating that in a game you look at the position and velocity at intervals and calculate how the bullet moved between the previous frame and the current one. Assuming simple Euler integration for now, the bullet essentially travels in straight lines between frames. Instead of following a gentle curve it steps down quite suddenly.

              In general the more times you integrate the better you will fit the intended curve.

              Now imagine we've changed from 30fps to 60fps. We've halved the interval, or doubled the number of times we check against the curve. The bullet still hits the ground at the same place, but it follows a more gradual curve and actually stays higher in the air for longer than it used to.

              If you had scripted a turret to fire under an opening door the bullets could suddenly end up hitting the bottom of the door just because you've increased the frame rate.

              World geometry is made up of many different shapes placed up against one another. There are small seams and intersections between them. If your character moves at a fixed velocity then they may skip completely over these seams at 30fps. You check the position on one side of the seam, then you move the state of the world forward by 33ms and check again. The character is on the other side and proceeds as if nothing happened.

              If you only move the world forward 16ms, though, when you next look at the position the character is right in the crack. At this point they might be stuck, they might fall through, they might continue to apply forward velocity, build up massive forces, and then fire off into the sky.
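
              Here's a contrived sketch of that seam problem (the numbers are made up): the same constant-speed character, position-checked once per tick, steps clean over a narrow seam at 30Hz but lands a sample right inside it at 60Hz.

                  // Toy example: position sampling at 30Hz vs 60Hz around a narrow
                  // seam between two floor meshes (all numbers invented).
                  #include <cstdio>

                  const double SEAM_MIN = 3.48, SEAM_MAX = 3.52;   // 4cm-wide crack

                  void run(double dt) {
                      double x = 0.0;
                      const double speed = 10.0;   // units per second
                      bool caught = false;
                      while (x < 5.0) {
                          x += speed * dt;         // advance one simulation tick
                          if (x >= SEAM_MIN && x <= SEAM_MAX) caught = true;
                      }
                      std::printf("%2.0f ticks/sec: %s\n", 1.0 / dt,
                                  caught ? "a sample lands inside the seam"
                                         : "every sample steps right over it");
                  }

                  int main() {
                      run(1.0 / 30.0);   // samples at 3.333, 3.667 -> misses the crack
                      run(1.0 / 60.0);   // sample at 3.500 -> right in the crack
                  }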

              Even with modern games and improved physics you still often lock the simulation frame rate down to a fixed constant to avoid unpredictability.

              It's easy for them to put in higher resolution graphics. It would be difficult and time consuming, but still possible, to separate the animation and simulation updates so that you at least have 60fps animations. It's hard to increase the simulation frame rate without undesirable consequences.

                This is starting to go in circles so I'll make this my last post.

                Your posts seem to document what is being done. But the matter still remains: what is currently done/practiced is still wrong.

                Take your own example of a fired bullet. The simulation should only care about the passage of time and not care about the frame rate.

                There is a need to separate the 3D simulation/modelling from the rendering process. Rendering should be purely observational.

                When a frame is to be drawn, it should capture the position of all models at that moment, do its work (texture mapping, shaders, etc.), then check the environment again for the next frame.

                The end result should not be a speed up of objects in the environment but smoother movement if more frames per second are rendered.

                That being said, I'm starting to think I have my answer.

                To keep the two separate, the 3D environment at that instant has to be cloned (thus taking up a lot of RAM). That way the graphics system can do its work without influencing the timing of the simulation.

                That being said, it does not invalidate my original point. The two systems - simulation and rendering - need to be decoupled, so I think more work should go into finding an optimal way to make rendering fully independent of the simulation, acting the same way as a real-world camera: each frame a separate, individual capture of the simulation's moment in time.
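
                Sketching what I mean in code (illustrative names only - and from what I've since read, this resembles the well-known "fix your timestep" pattern engines actually use): the simulation ticks at a fixed rate while the renderer runs as fast as it likes and blends between the two most recent snapshots. Notably, only the gameplay state needs copying, not the whole 3D environment, so the RAM cost is smaller than I feared.

                    // Sketch: fixed-rate simulation, uncapped interpolated rendering.
                    #include <chrono>

                    struct State { double x = 0.0; };       // stand-in for full game state

                    State integrate(State s, double dt) {   // one fixed simulation tick
                        s.x += 10.0 * dt;                   // physics/AI/animation go here
                        return s;
                    }

                    State lerp(const State& a, const State& b, double t) {
                        return { a.x + (b.x - a.x) * t };
                    }

                    void render(const State&) {}            // draws a snapshot, never mutates it

                    int main() {
                        using clock = std::chrono::steady_clock;
                        const double dt = 1.0 / 30.0;       // simulation stays locked at 30Hz
                        double accumulator = 0.0;
                        State previous, current;
                        auto last = clock::now();

                        while (current.x < 100.0) {         // render loop, uncapped
                            auto now = clock::now();
                            accumulator += std::chrono::duration<double>(now - last).count();
                            last = now;

                            while (accumulator >= dt) {     // consume real time in fixed steps
                                previous = current;         // the only "cloning" required
                                current = integrate(current, dt);
                                accumulator -= dt;
                            }
                            // Blend the last two snapshots: a 144Hz display gets smooth
                            // motion while the simulation results stay identical.
                            render(lerp(previous, current, accumulator / dt));
                        }
                    }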

              But at the same time, the art of creating game engines also needs to adapt over time and what we are seeing is something that belongs back in the 80s/early 90s where the CPU clock was the only clock and everything was timed to it.

              *Actually* this is part of the problem. PS2 hardware was weird. The SPU and the CPU were not explicitly synced to the same time and basically relied on the fact they ran at specific frequencies, so I believe you had to do some work to manually keep everything in sync. Complicating things was the fact that there were two SPUs, one of them the PS2 chip and the other being the PS1 sound chip, and you needed to use both and they ran at different clock speeds.

              This was a huge problem for PS2 emulation for a long time - if the actual game dropped frames the sound would distort, run slowly or clip in some situations. I think eventually they solved this in PCSX2 by explicitly using a second core on dual core machines to ensure that the SPU and CPU emulation was running simultaneously.

              Also there's the simple fact that constant 30fps actually feels much better than something that swings wildly from 30-60.

                Also there's the simple fact that constant 30fps actually feels much better than something that swings wildly from 30-60.

                I'm not going to lie, I'm also for consistency rather than having frame rates vary wildly.

      Having 'timers' like you describe is very much a GUI paradigm, not what's used in games. Games, especially for older hardware, tend to be single-threaded and very tightly controlled because it's easier to optimize.

      Games generally have a simple update loop. You render the current frame, then update the game state based on the elapsed time since the last frame was rendered. Updating the game state will usually poll for input, update entity positions and animations, change AI states and so on.

      I imagine the reason they went for a 30fps lock was to simplify a lot of their calculations, which might have been necessary on the PS2 to keep it running smoothly - I remember getting frame drops in some areas, and by having the update loop directly tied to the framerate, the game simply slows down a little, usually imperceptibly unless the drop is significant. They likely decided that slowing the game down during frame drops was preferable to time-based updates, where instead of feeling slowed down, the game becomes a slideshow that's much harder to control, because the state keeps updating at the same rate so everything moves much further in a single frame.
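
      In sketch form, the kind of loop I'm describing looks something like this (all names are illustrative stubs, obviously not Okami's actual code):

          // Sketch of a frame-locked update loop: one fixed tick per frame.
          #include <cstdio>

          const double TICK = 1.0 / 30.0;   // the step implied by a 30fps lock

          struct World { double elapsed = 0.0; } world;

          void pollInput() {}
          void updateWorld(double dt) { world.elapsed += dt; }   // entities, AI, animation...
          void renderFrame() {}                                  // draw, then wait on vsync

          int main() {
              for (int frame = 0; frame < 300; ++frame) {
                  pollInput();
                  updateWorld(TICK);   // always exactly one tick per rendered frame: if a
                                       // frame takes too long, the whole game slows down
                                       // rather than entities jumping further per frame
                  renderFrame();
              }
              std::printf("advanced %.1f seconds of game time\n", world.elapsed);   // 10.0
          }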

      It's also likely that they knew the exact target framerate and optimized everything in the state update to save cycles since the PS2 wasn't a powerhouse by any stretch, and they were likely using a lot of cycles to do the Sumi-e art style.

        Having 'timers' like you describe is very much a GUI paradigm, not what's used in games.

        Don't know about timers being GUI-centric. I was going by the idea of each system having its own clock/frequency, so one system going faster doesn't affect another.

        But I will admit, most of what I know comes from the book Game Programming Complete by Mike McShaffry, and even then that was a few years ago. I think the book used the term timers back then, but I can't remember unless I dig the book out again.

          Ah, I assumed by "timers" you were talking about actual Timers which are common in C# and Java programming. Basically "after this number of ticks, call this function", so it was essentially callback-based. Designed for GUI stuff but you could use it and often do use it in Java for some styles of games.

            Ah, I assumed by "timers" you were talking about actual Timers which are common in C# and Java programming.

            Gah! I forgot about those! I was talking about timers in general, not the C#/Java implementations.

            Sorry about the confusion.

    Yet another advantage of being a Nintendo fan: We were never "trained" to care about FPSs, resolution, polygon count, whatever. Fun and playability have always been the only key factors to enjoy a game. I cannot imagine how it is to be soured by an announcement like this.

      What does your colon smell like bud? You must know after all with your head so far up there.

        Keep it coming, buddy. Angry, butthurt outbursts from haters that cannot keep it inside their pants even when nobody is attacking them amuse me like few things can.

          I'm still puzzled as to how you can type comments on here. Your head seems to be going even further up your own rectum. You are a medical miracle.

      So you're used to disappointment and delays then?

      Climb down out of your own ass and see that Nintendo isn't the golden child either.

        Wow, did I touch a nerve? I'm not saying that it's any kind of "golden child" or anything. Of course it has its flaws and faults. However, if you cannot see anyone else praising something good about something they like without feeling personally offended, I'd say you're the one who needs to climb down from nether cavities.

      I can slightly agree with this... I mean, playing those 15-25fps 3D games in the late 90s / early 00s makes it easier to put up with 30fps games today, although if you're gonna revamp a game for 2017 then at least put in the effort to have unlocked frame rates.

      But in saying that, it's a great game and no one will be missing out, no cut content or micro-transactions, etc.

        You probably worded it a bit pretentiously (real word?) but yeh, I agree. PC gamers just wanna use their hardware to the max but really, they should probably be thankful they're getting what was a PS2 exclusive from, what, 2006? to begin with.
        I dunno, I haven't been a PC gamer for years, I am just used to whatever scrappy fps we get thrown lol

    So in other words they have a good-ish explanation as to why, but PC gamers are acting like they always do... childish, just childish.

    And just saying, but a game does not need to be 60fps to be good. Now, if it had trouble hitting 30fps then sure, but come on - just 'cos you blew a ton of cash on an 'ultra high spec gaming computer' does not mean that game devs need to bend to what you want.

    It was thought a few years ago, when the consoles started running essentially PC hardware, that it would be good because of more ports. But PC ports are now getting the raw end of the deal because of console hardware. This isn't the first game that's come to PC as well as console locked to 30 frames, and it certainly won't be the last.

      The PS2 and PS3 certainly did not have PC hardware so I don't see where this argument is going.

        (The PS4 was a few years ago. Also, time keeps on moving like crazy. You're probably thus old, now. Congrats.)

          Ah, I missed that it was on the PS4. Good catch.

            Well, I don't think @guestwhowould was strictly talking about Okami specifically anyway, but rather using the topic as a springboard to muse about the general optimism of PC gamers when it was revealed that the 'next gen' (xbone and ps4) were basically using PC architecture.

            We all figured, "Oh, maybe we'll finally get good PC ports, since it's so similar!"

            Unfortunately it's still a crap-shoot. Lotta shitty ports still coming out for PC, hanging on to PS4/bone fps caps and other design considerations.

              Yeh but us console-only players are still getting shitty ports of what are ostensibly PC games too, right? It's just we've learned to play our games that way. PC players still have so many advantages, like one game for several "generations" whereas we have to keep on repurchasing. Or the mod scene and the crazy-ass shit they have going on. Or the constant Steam sales. We do get the shorter end of the stick, and hey, I'm lazy, that's fine with me. But maybe PC players just complain more lol, or is that gamers in general these days? Yeh, it's just gamers I guess. I mean, I am constantly bitching about indie exclusivity so yeh, guess I'm a whinger too.
              I feel like I had some sort of point to my post but nope it's gone now lololol

              I would be okay with a Stalker port for PS4. Even Planescape or Icewind Dale would be cool to see.

    As a game developer, any game that doesn't correctly use an "elapsed time / game time" concept is just lazy programming.

    The game should be able to be run at any framerate (within reason) without impacting physics, etc. unless horrible shortcuts were taken during development - and if that is the case... why are we paying money for lazy hack jobs?
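
    For the non-programmers, "elapsed time / game time" just means scaling movement by real time rather than by frames - a toy sketch (illustrative only, not anyone's actual engine code):

        // Frame-rate-independent movement: distance scales with wall-clock time.
        #include <chrono>
        #include <cstdio>

        int main() {
            using clock = std::chrono::steady_clock;
            double x = 0.0;
            const double speed = 50.0;   // units per SECOND, not per frame
            auto last = clock::now();

            while (x < 100.0) {          // stand-in for the frame loop
                auto now = clock::now();
                double dt = std::chrono::duration<double>(now - last).count();
                last = now;
                x += speed * dt;         // the same distance is covered per second
                                         // whether this runs at 30fps or 300fps
            }
            std::printf("travelled %.1f units\n", x);
        }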

    This kind of thing annoys me. Boycott.

      As a game developer, shouldn't you know that linking time and other functions to the framerate was common practice for a reason?
      Or that Okami was developed in a time when future viability on this scale wasn't even a concept in the gaming industry?
      (And on a side note, isn't the entire field proud of the fact that many of the most iconic titles and game features are the result of shortcuts, patch jobs, bugs, unintended mistakes and all-out cock-ups?)

      Mate, if you could take a game you released 11 years ago and get it running on a new platform with the ease you describe, you would be some kind of genius time wizard, not a game developer.
      Shit, didn't we just have a story recently about an online game that accidentally left an animation tied to framerate, resulting in players who used the 60fps option taking more damage?

      I dunno, I'm not a game developer...

    Who cares, if it takes off we might actually get an Okami 2 or something from that universe on modern systems!!
    (Though, they will prob lock that too because Capcom)

    So why are they not using the original GC version, or the PS2 version, as a base? Why use the version made for the notoriously shitty-to-program-for PS3?

      GameCube? You mean Wii, right?

      Okami didn't come out on the GameCube.

        Correct. For some reason I had lumped Okami into the Capcom 5 set of games for the cube.

          Sadly it isn't, but that does remind me that we need to seriously push Capcom to get the best Capcom 5 game, P.N.03, ported to something modern.

    Insta no-buy. Maybe, maybe a buy if and when there is a FOV mod. Maybe. Otherwise I might as well emulate my PS3 copy in RPCS3. That's 4k anyway.
