Final Fantasy 15 Won’t Be 170GB On PC

While it’s great that Final Fantasy 15 is coming to PC, one of the things that wasn’t so great was the headlines warning about the game’s install size.

Shortly after Square Enix and director Hajime Tabata announced that FF15 was getting a PC release, people started to panic about the game’s install size. And that wasn’t through any fault of the press: the figure came from a press release put out by Square Enix themselves, which also listed the pricey GTX 1080 Ti as part of the “recommended specs”.

But in a chat with Tabata at Gamescom, the veteran developer confirmed that the listing of the hard drive space was a “miscommunication”, as were the recommended specs that had gone out to press.

“That was a mistake, actually,” Tabata said through a translator. “That was a communications mistake – something got put in a memo that really shouldn’t have. What that is [the specifications that went out to press] based on the specs that we’re running the demo on today.”

“Again, the final specs for the release version haven’t been fixed yet. There’s a very good chance they can change. Someone put that in there and it got reported as the recommended specs, but that’s not the final fixed version. The fact that it became that number is a communication [error].”

Tabata went on to say that the demo machines were running at 4K using GTX 1080 Ti cards, but the minimum specs would go down “at least as far as the same level as the console edition, maybe looking to going even further than that in future”. “But we’ve not fixed the frame rate at 30 frames at all – it’s just on the 1080 Ti, that’s the maximum that one can do [at 4K], so that’s what it’s doing at the moment.”

It’s a good thing, really. The 1080 Ti is still a $1000-plus card, and with Vega 64 not coming close and cryptocurrency miners helpfully driving the price of GPUs up, few gamers would get to enjoy FFXV at those “recommended” specs.

The author travelled to Gamescom as a guest of NVIDIA.


  • I expect the game won’t be anywhere near as large as on the consoles, as they will most likely release it with the intention of it being played on systems that aren’t bad by standards set six years ago.

    I love my PS4, but it is starting to get a little long in the tooth.

    • Why would targeting more powerful systems result in a smaller download? If anything, I would have thought aiming to support 4K rendering would mean shipping higher resolution textures, resulting in a larger size.

      • Texture compression – not sure how much things have changed since I last read up on it, but between limits on processors, HDD and RAM speeds and capacity, consoles always used to need things in a format which was ‘easy to access’, and thus not so efficient to store – kinda like prerendered cutscenes vs in-engine in previous generations of consoles and game engines.

        • The easiest example of this is the first Titanfall, which was designed for toasters (PCs included) with absolutely zero compression, which in turn caused a game with very few assets to be 50-something GB.

        • That sounds a bit surprising to me. Do you have any references about it?

          Texture compression has been around for ages, and one of the main reasons it is used is that the time spent decompressing is usually shorter than the time saved by reading less off disk. I would have thought this would be just as true on consoles.

          I can believe that modern PCs could handle more complex compression and more efficient algorithms; I can also believe that the higher resolution textures compress better, since they have a similar amount of detail to the lower resolution ones. But it seems surprising that the higher resolution textures would compress smaller than the lower resolution ones do.

          • Texture compression also used to be limited to certain video cards (older engines often had an assortment of settings for whether you needed to use 8/16/32-bit textures and whether texture compression was supported at all by the video hardware) – this is going back quite a few years now though.

            The time saved decompressing vs reading has always been more of a console issue, since even the XBOne I think still uses a 5400RPM HDD for stock storage, compared to PCs commonly running 7200RPM (this of course ignores any SSDs in the mix, or differences in available RAM speeds) – hence why some folks can get load-time improvements with external drives connected to their console via USB.

            I think due to the span of development time, FFXV also uses some prerendered cutscenes – this is another place they could likely save space because, even if they couldn’t switch over to in-engine, PCs would likely support much better video compression, which could save a heck of a lot storage-wise!
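For what it’s worth, the back-and-forth above about whether higher-resolution textures can end up smaller hinges on how GPU texture compression works: formats like BC1/DXT1 and BC7 are fixed-rate, encoding every 4x4 pixel block in a constant number of bytes, so the GPU-ready size of a texture scales directly with its pixel count. A rough sketch of the arithmetic (illustrative only; the bit rates are standard for these formats, but none of this is from the article):

```python
# Fixed-rate GPU block compression: each 4x4 pixel block is stored in a
# constant number of bytes, so size depends only on resolution.
# BC1/DXT1 uses 8 bytes per block (4 bits/pixel);
# BC7 uses 16 bytes per block (8 bits/pixel);
# uncompressed RGBA8 uses 32 bits/pixel.

def texture_size_bytes(width, height, bits_per_pixel):
    """In-memory size of a texture stored at the given fixed bit rate."""
    return width * height * bits_per_pixel // 8

for res in (2048, 4096):
    raw = texture_size_bytes(res, res, 32)  # uncompressed RGBA8
    bc1 = texture_size_bytes(res, res, 4)   # BC1/DXT1: 8:1 vs RGBA8
    bc7 = texture_size_bytes(res, res, 8)   # BC7: 4:1 vs RGBA8
    print(f"{res}x{res}: raw {raw >> 20} MiB, BC1 {bc1 >> 20} MiB, BC7 {bc7 >> 20} MiB")
```

So a 4K texture is always four times the GPU-format size of its 2K counterpart; any savings on disk would have to come from an additional lossless pass (zlib and the like) applied on top, which is where the PC-vs-console differences discussed above could matter.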

  • The article onto talks about the minimum requirement of the 1080Ti and says nothing about the 170GB install size.

    • It ONLY talks, not onto.

      Kotaku can you please fix your comment system so I can edit a comment without it being sent automatically for moderation?

    • But in a chat with Tabata at Gamescom, the veteran developer confirmed that the listing of the hard drive space was a “miscommunication”, as were the recommended specs that had gone out to press.

      It does actually. Not nearly as in depth though.

  • Small correction: the 1080 Ti is actually around $600–$700. You might be thinking of the Titan Xp. Those are $1000+ cards.

    • Correction to my earlier correction: the 1080 Ti actually starts at $700. The links you posted in the article are all grossly overpriced. Check for more sane pricing on cards that are actually in stock – 1080 Tis in stock start at $719. And they would actually be cheaper if not for the current GPU market due to the mining craze.

  • I strongly disapprove of the emphasis some companies place on using Nvidia-specific technologies, since they are frankly a bad actor…

    There is too little competition and fairly little innovation – mostly there seems to only be patent trolling and lock-in…

    Nvidia is famously bad at working towards open standards and technologies, as illustrated by them buying up PhysX and then hoarding that tech to themselves… thus rendering it almost useless, since developers can’t rely on it being available.

    Frankly, at this point I don’t think any developer has any excuse not to use the Vulkan API.

    Doom 4 clearly shows it can deliver killer performance and that it’s plenty powerful enough. It also eliminates the need for a lot of driver overhead, which helps eliminate the disadvantage AMD has regarding driver overhead.

    That would help competition, and without competition there can only be stagnation and price gouging… another of Nvidia’s vices.

    It’s sad AMD did not produce a GPU that was more competitive this time round, but their CPUs were a big advancement, and they are the underdog in both those races.

    Frankly, I wish there were some mechanism for abolishing patent protection if sufficient innovational momentum is not maintained.
