Fallout 4 Won't Be On PS3 Or 360, Bethesda Says

Here's some good news for anyone concerned that last generation's consoles are holding back today's game developers: Fallout 4 is current-gen-only, Bethesda says. No quiet last-gen releases coming down the road.

Some might have assumed as much when the publisher announced Fallout 4 last week for PC, PS4, and Xbox One, but there were certainly gamers -- myself included -- worried they had a quiet last-gen port in the works to maximise sales. Fortunately, that's not happening, according to Bethesda global community lead Matt Grandstaff, who says the next Fallout ain't coming to 360 or PS3.

This current-gen exclusivity is important for two reasons:

1) It means Todd Howard and his team at Bethesda Game Studios will be able to take advantage of the expanded memory limits on today's hardware, which is especially important for an open-world game like Fallout, where everything -- from interactive objects to building interiors -- takes up a ton of memory.

2) It totally debunks that dumb Reddit rumour, just in case our debunker wasn't enough for ya.


Comments

    2) It totally debunks that dumb Reddit rumour, just in case our debunker wasn’t enough for ya.

    You know there are ways to discredit a rumour, and ways not to discredit a rumour. I'm glad that lately Kotaku's sticking with the bitchy, worst way to discredit one. Consistency is key, after all, even if that consistency is awful.

    Also, Kotaku, seriously, get a better writer; this guy is just the worst. His negative, bitchy attitude is putting me and a lot of others off the site. It's sad that you go from highs like Mark Serrels and Junglist to A Current Affair-level pap like this.

    Just incidentally, a question such as "Was Fallout 4 *ever*, *at any point*, considered for 360 or PS3?" would have been better too, because you'll probably find they did consider it at some point, to some degree, before deciding to move on to current gen. It's not impossible, but that single comment doesn't debunk it, given how old that Reddit thread was.

    Last edited 08/06/15 5:10 pm

      You forgot Plunkett.

      I have to admit, even this article got me excited after a whole day of "no-news Monday".

        Stupid queen and her stupid birthday.
        On a completely separate note, when the Queen dies and we get a king, does the day the holiday falls on change, and do we call it the King's Birthday?

          Isn't the current 'Queen's Birthday' actually the current Queen's grandmother's birthday or something, and they just never bothered to change it? :P

        Hey, let us not forget Brian *one-sentence clickbait* Ashcraft.
        I was of mixed opinion on Jason, but now, yeah, Mark and Junglist are all that's left. Kotaku is becoming like The Escapist: without Zero Punctuation there's little reason to go there, and there's little reason to go to Kotaku.

      Couldn't have put it better myself.

      Schreier is great for one thing lately, making Kotaku as a whole look like a joke. But hey, if they're all happy to be considered just as 'professional' then so be it.

      You're right! So glad to see Kotaku keeping up their snarky and elitist tone!

      I agree. Things need to improve.

      - Reposting is getting out of hand. I have seen articles reposted three or four times over a few days.
      - Buzzfeed-style posts are getting out of hand.
      - Some posts lack any real substance, some containing just quotes/links without any input whatsoever. This happens on a weekly basis.
      - This site has a hard-on for GIFs. While this is a smaller problem, some posts contain over 10 GIFs, making loading painful on any connection and the layout unprofessional.

      Once you have gotten through some of these "hurdles" and found a new article that is not a repost etc., you have the problems weresmurf described above.

      I don't expect to have my mind blown every time I click on a story, but I do want something worth reading.

      Last edited 08/06/15 11:44 pm

      That's journalism in Australia these days. Instead of giving us the facts and letting us decide our opinion, they tell us what we should be thinking. How else would muppets like Abbott get into power?

        I'd love to blame it all on the media, but surely at least some of it comes down to the quality of the opposition (Labor was still the lesser of 2 evils sure, just not quality evil) and the retarded voting public.

        This is Abbott-solutely unacceptable.

      It'd be great if they were semi-self aware or had a shred of dignity, wouldn't it?

    At this point I'm not expecting any more games to be released on the 360 or PS3. I was surprised to learn that Mortal Kombat would be.

      Hasn't MK on 360/PS3 now been delayed to TBC?
      Interpreted as 'no chance'.

      That's optimistic of you. In an industry allergic to "risk" and also the prospect of building for the future, I'm entirely expecting more games to be announced for 360 and PS3.

      The sales say it all as well. Look at the sales split for Destiny as a good example: new gen got 82% of sales versus 18% on last gen.

      Yes, there are lots of last-gen consoles out there, but people aren't buying games on them when they have an Xbox One/PS4 to use instead.

    Personally, I don't believe that ANYTHING should be developed for 360/PS3 at the expense of current-gen technologies. My opinion is that games should be created FIRST AND FOREMOST for PCs and then dumbed down as required for the PS4/Xbox One so those consoles can handle them. I'm getting rather sick and tired of the trend changing to "oh, let's make a game for X console, then we'll port it to PC and put zero effort into it". Having been gaming since the late 80s, this change in attitude from developers is really starting to piss me and a lot of other people in the gaming community off. Not making games for 360/PS3 just shows how slowly consoles in general move with the times, and how they are, almost single-handedly, always the ones responsible for holding graphics and performance back from the rest of the industry. Consoles will always have their place, but it should NEVER be at the expense of the best possible gaming experience on a PC.

      I'd rather a console game than an "Nvidia" game that runs like shit on any other graphics card. You say made for console, dumbed down for PC with zero effort; I say made for Nvidia (the logo even says as much), dumbed down for everything else with zero effort.

        It's hardly Nvidia's fault that AMD provides terrible support for in-development titles. The 'Made for Nvidia' logo is no different to the 'Made for AMD' logo, they're both part of their respective companies' development support programs. Neither of them prevent the devs from optimising for the other, the reason you see the Nvidia logo more frequently is because they're more proactive about getting out there helping devs. The situation with Project CARS is a prime example.

            Exactly, ZombieJesus, which is why I no longer care for AMD after being loyal from 2000 onwards, and am now just accepting paying more for an nVidia card from my next upgrade onwards. Pay more, have it actually freaking work.

              Yep, I haven't used an AMD card since 2009. Too many errors and bad FPS using AMD, even though they benchmark better than Nvidia. I remember playing New Vegas with an AMD card, and there was actually a set of bushes near the REPCONN facility that would not show up. It wasn't until I switched to an Nvidia card that I finally saw the bushes.

              I like how we get voted down, but even AMD's CEO came out two years ago saying they don't get drivers right for games fast enough and are losing market share, on top of bad CrossFire scaling, stuttering frames, etc.
              I cheered, thinking they were going to go all out and get things right. The Eyefinity fixes, stuttering fixes, and driver releases never came, or slowed further.
              Today I purchased two GTX 980 Tis. Sorry, but I want to enjoy The Witcher 3 and Project CARS now, not in three years, MAYBE.

          It's kind of surprising really: if the developer is going the multi-platform route, you'd expect them to focus on getting it working well on Radeon GPUs, given that's what you find in the PS4, Xbone and Wii U.

          That Nvidia still manages to thrive isn't a particularly great endorsement for AMD ...

            Yeah, when the three current consoles got announced I was like, YES, AMD, this is going to so benefit my two 7970s, but noooope, things got worse. I hate the ethical team Red losing out to the Green team, but facts are facts.

          Things like nVidia GameWorks are just that, though: a way of indirectly preventing companies from optimising for AMD. At least when AMD makes a novel graphics tech, such as TressFX, it is allowed to run uninhibited on nVidia hardware.

          From what PC devs have said, and given the huge improvement DX12 will bring on PC for AMD, the issue with AMD in a few modern games is the relatively clunky way it handles draw calls under DX11 when compared to NV. This is because AMD put their development into Mantle, which, while being very good for the industry, didn't do them any favours. NV, on the other hand, optimised more for DX11, which has helped them in situations like Project CARS but was ultimately a much more selfish move. Sadly, in business (and for AMD) it pays to be selfish, even if it means more people get a worse deal on the whole.

          Last edited 08/06/15 9:35 pm

            Gameworks doesn't prevent anything, that's nothing more than a conspiracy theory. Plenty of devs have commented over the years that Nvidia doesn't lock you in to anything, they simply offer help, while AMD does so much less frequently. Nvidia's proprietary features are all optional additions that don't affect the core game performance. There has never been anything stopping a game including Nvidia and AMD features except lack of time and lack of help.

            Hairworks doesn't work well on AMD cards because AMD cards are bad at tessellation, for the record, not because Nvidia did anything proprietary to lock the API down to their cards. TressFX works somewhat poorly on Nvidia cards because Nvidia cards have weaker OpenCL support than AMD. Tessellation is a critical technology in DX11 onwards and a lot more important than OpenCL in graphics processing, it's very much AMD's problem if their cards aren't handling it well.

            AMD's investment into Mantle wasn't a good move in most respects. For developers, the low-level way it interacts with hardware harkens back to the old days where the game, not the platform has to be responsible for detecting and enabling or disabling graphics features. Sure it's faster, but speed isn't the most important thing developers need, abstraction is. Without adequate abstraction, modern games are much harder to make. This is one of the reasons so few studios have taken up Mantle, because DirectX does a vastly better job at abstraction in exchange for a relatively small performance hit.

            From a business perspective, investing so heavily in Mantle was a terrible idea. DirectX is well established, even OpenGL has taken decades to carve out the small percentage of market share it has. On top of that, AMD is a hardware company, graphics platforms are a software solution. It was a poor move trying to crack into a market that other companies have significantly more expertise in when AMD's customer base expects hardware performance from them first and foremost. Especially given how bad a reputation AMD has for drivers, there's very little confidence in Mantle in the market.

            DirectX 12 is good for everyone. I wouldn't expect a silver bullet for AMD out of it, but hopefully it reminds them of where their core business lies and encourages them to focus on hardware performance improvements. Right now they're 12 months behind on fabrication technology, and they run too hot and draw too much power. They need to get back to competing properly in hardware, so that there's enough competition between them to keep the industry healthy for the rest of us consumers.

          You are right, it's not nVidia's fault at all, nor is it AMD's fault. It's clear AMD doesn't have the features that nVidia does; the problem is that developers put too much reliance on things like PhysX and the fantastic hair that nVidia does.

          I know it's oversimplifying it, but where is the simple check "if nvidia, draw fancy hair, else don't"? This isn't nVidia or AMD's fault; it's the developer not optimising their game, and AAA titles are no exception at all. PC is and always will be an open platform; if you're too lazy to optimise your game to work on high-end competitor cards, then you should be developing your title for a console where the hardware is fixed.
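The check this commenter is asking for can be sketched in a few lines. This is a hypothetical illustration, not real engine code: the `render_hair` helper and the vendor string it inspects (e.g. the string a driver reports via OpenGL's `GL_VENDOR`) are assumptions for the example, standing in for whatever feature-gating a real renderer would do.

```python
def render_hair(vendor: str) -> str:
    """Gate a vendor-optimised effect behind the driver-reported
    vendor string, with a portable fallback for everything else."""
    if "nvidia" in vendor.lower():
        return "hairworks"      # vendor-specific fancy-hair path
    return "standard_hair"      # fallback that runs on any card

# e.g. render_hair("NVIDIA Corporation") selects the fancy path,
# while render_hair("ATI Technologies Inc.") takes the fallback.
```

In practice the gate is usually a user-facing graphics option rather than an automatic vendor check, since users may want the effect off even on supported hardware; the point of the sketch is only that a fallback path has to exist at all.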

      Anybody who doesn't spend $1000 on a video card every year shouldn't be allowed to play PC games, that way we can have the cutting edge elitist Utopia we've all dreamed of.

      There aren't a lot of beasts capable of the upper limits of resolution and graphical fidelity out there. PCs are not a big enough market anymore to justify leaving the consoles behind, especially with games costing as much as they do. Sharing the burden between consoles means you get games. And if you raise the bar too high, games either have to become far more expensive, because fewer people can run them, or you don't get any.

        That's a bit of an alarmist view. There are plenty of studios and titles that are PC exclusive that do just fine financially. Catering for PC only won't make games vanish or anything so drastic, it's just more appealing to target all available platforms to maximise profit.

          While there are many PC-only franchises out there, how many require a top-of-the-line PC? Consider how Red Dead Redemption wasn't considered worth a PC port.

      My opinion is that games should be created FIRST AND FOREMOST for PCs and then dumbed down as required for the PS4/Xbox One so those consoles can handle them. I'm getting rather sick and tired of the trend changing to "oh, let's make a game for X console, then we'll port it to PC and put zero effort into it".

      That always has been and still is the case: build on a PC first, test on a dev kit, then use the same build everywhere, because the bigwigs have put unrealistic costs and expectations on the developers.

      Not making games for 360/PS3 just shows how slowly consoles in general move with the times, and how they are, almost single-handedly, always the ones responsible for holding graphics and performance back from the rest of the industry.

      Unless the consoles become sentient and write code instead of programmers your claim will never be valid.

      It never has been the fault of consoles. It is either developers who are too lazy to squeeze performance out of a given platform, or who have been hamstrung by publishers and/or parent companies that expect CoD-style profits for minimal cost.

      At the end of the day, PCs are just one platform and if you're not happy then stay away from multi-port games.

      Oh wait, the idea of exclusivity (as well as good programming and strong game design) is all but dead now. Never mind.

      Last edited 09/06/15 7:36 am

      witcher3.exe has stopped working. Click here to report the problem to Microsoft.

      I like the 'PC first, port to console' idea, but it doesn't work. It's a lofty idea, but catering to the lowest common/console denominator is about more than just texture resolution and anti-aliasing.

      Having potentially top-of-the-line power at your disposal causes you to be lazy with your optimization AND your design. Designing to improve graphics on 8yr old consoles has seen the tail end of last-gen produce some really interesting techniques. Techniques they had to come up with to do more with less. If they hadn't faced those constraints, they'd have designed around brute-forcing... which would've made console ports impossible.

      The legacy of the console-first design is not a pretty one. Catering to consoles is what has brought us reduced-draw-distance fog, disappearing corpses, fewer-poly-on-screen cramped hallways map-design, checkpoint systems instead of manual saves, number of enemy AIs able to be encountered at once... all design things you can't easily just 'patch in' to your console port. If developers were able to achieve everything they were hoping to relying on the relative power available on a PC WITHOUT a thought toward console limitations, they'd frequently have to virtually redesign the entire work if they wanted to port it down. Which would mean nearly double the cost. Not worth it when you can work the other way.

      That's why designing to the lowest console denominator is the norm, and PC porting is farmed out to contractor studios who either phone it in and leave gamepad control prompts in the tutorial tips, or simply pack in a few extra gig of only-slight-scaled-down textures.

      Last edited 09/06/15 10:48 am

        It was fine in the 90s, though. Starcraft, Diablo and Doom all got Sony and Nintendo ports from PC, to name a few of the many people have forgotten about.
        Though to me it is all bullshit: whichever direction you go, just get it right for all the platforms you port to if you are going to take people's money.
        All this talk of millions of PC configs is rubbish. For years both GPU makers have used the same chips for a generation or three, based on DX, just of varying power (e.g. nVidia Maxwell or AMD Tahiti), and AMD and Intel have made x86 instruction-based chipsets for donkey's years.
        All it is, is a case of AAA devs going: the PC team gets this many fewer members, because this many per cent of sales will be PC versus console. Yet PC sales will lag with early adopters of any title if it's a crap port (a la AC Unity), so you won't get more sales without spending money.

    No quiet last-gen releases coming down the road.
    What were you expecting? "Fallout 4: Not-so-Good Edition"

    It doesn't debunk the Reddit rumour, at all. It's entirely reasonable to think that a year ago Bethesda may have been considering a PS3/Xbox 360 port.

      I suspect when they started production on this, it was most likely on their plate. Probably got descoped down the track so they could focus on 3 platforms only.

        Exactly, but that wouldn't make for a good bitchy rant from Kotaku now, would it?

    I think abandoning the so-called 'old gen' consoles is a stupid move. There are far more PS3s and Xbox 360s in the wild than PS4s and Xbox Ones.
