Aside from the work that’s gone into the first episode of the Final Fantasy 7 Remake, Square has done a ton of R&D into the technology behind the game. How much time exactly? Several years, according to the game’s producer.
Yoshinori Kitase, the producer on Final Fantasy 7 Remake and the director on the original Final Fantasy 7 and Final Fantasy 8, explained in a preview session that Square had built a lot of systems behind the scenes to handle dynamic audio, facial movements and camera control. The idea was to automate some of the small things in the process that animators, artists and designers have to deal with, freeing them up for tasks that can’t be automated.
How long did it take to build all of this? “I can’t give exact numbers here, but it’s about the same kind of time it would take to develop a single game in its own right, so a good several years,” Kitase said through his translator.
“It’s actually not a difficult thing to convince upper management about at all, and the reason for that is, OK, there’s an initial cost in developing that and doing the R&D on these AI systems, but after they’re created they actually save on costs, because it’s automating stuff that artists and creators would have had to do laboriously, stage by stage, by hand before,” Kitase said. “When we have these systems, in future we can use them on other projects and you’ll actually save a lot of time and effort and man hours, so upper management quite like these kinds of things.”
The tech isn’t built directly by the Final Fantasy team, though. Much like DICE’s engineering team, which works exclusively on Frostbite and supports various EA studios through different stages of development, Square has a separate R&D department with its own pipelines.
“They’re generally always working on this new kind of technology, when they’ve completed a project they’ve got something up and running like that, they’ll hand that over to the individual development teams to use. So it’s not like we’re using our own development time,” Kitase said, adding that Square Enix has a dedicated sound team “outside of the game development pipelines” that contribute to ongoing projects.
He added that the base version of Unreal Engine wasn’t sufficient for what the team needed, so they added some “heavy customisations” of their own, including new shader tech and effect rendering technology, as well as the in-house AI camera and facial animation systems.
The AI facial animation system works from the recorded voice data in each language. “It will dynamically pick up on that, use AI to work out where the emotional emphasis is on the different words in those languages, and then adapt the facial expressions to fit with that naturally … depending on the voice data it’s picking up on, it will make them seem angry or sadder or wistful, and it really can adapt that to set up the mood,” Kitase said.
You can see some of the tech in action for yourself through the FF7R demo, which went live on PSN on Monday night.