The Aussie Studio Using Kinect Hacks For Low-Cost Motion Capture

Epiphany Games is all about working smarter, not harder. Its latest tech innovation takes one of the most time-consuming and costly aspects of development and turns it into something closer to play than work. Using two Kinects and some outside-the-Xbox thinking, the studio can motion-capture far more quickly and cheaply than bigger studios. I had to check it out.

More and more, I think the best things about the Kinect are things it was never meant to do. I've never been interested in playing a game on it. But hacking it for stuff like this? Way cool.

Epiphany has a room with two Kinects -- one in front of you, one to the left -- recording your movements so that after a short process, they can import them directly as an animation into their game. No need for a big studio, no need to wear special clothing with dots on it, no need to employ staff with special technical training. Just fire 'er up, calibrate, and go.

Before checking out the mo-cap room, they treated me to one level of Epiphany's upcoming game, which should be announced within the next couple of months. It was a nightclub scene in which I was trying to escape a sizeable bouncer named Chuck. Noticeably, the nightclub patrons had a wide variety of animations as they danced -- but when the bouncer caught up to me, I was equalised without taking a blow. In this pre-alpha, there wasn't an animation for a punch yet.

I do amateur boxing, so as we walked into the motion capture area, Epiphany's founder Morgan Lean suggested we grab a few combo animations. Perhaps our massive friend Chuck could provide physical motivation, instead of using his petrifying telekinesis.

The motion capture room is like any normal room, just with the tables and couch pushed up against the wall. After going through the calibration procedure and holding the Jesus pose, I'm told that I'll have a small box to move around in. I'll also have to go a little slower than normal, as, pending a tech update, the software only captures at 30fps.

After a few combos, slips, rolls, and sidesteps, we have a file that comes in at a surprisingly low 2MB.

"We had a sequence in which a character was jumping out of a 4-story window."

As Art Director Matt Purchase mans the driving seat, Morgan Lean explains what’s going on. “The software has just detected everything as a cloud of pixels, and from there, we’re going to put that into a program where we can tweak everything.”

The system does have limitations. With me facing the front camera, the camera on my left tracked that whole side -- but there was no camera on the right, meaning the system could lose track of my right arm. When it loses track of body parts, it improvises. It's not good when it improvises. So while I'd normally keep my guard more compact, I tried to make sure Kinect could see my right arm at all times; they could bring my elbows in later.
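To get a feel for why lost tracking matters, here's a toy Python sketch of the sensible alternative to "improvising": when tracking confidence for a joint drops, interpolate between the nearest confident samples instead of guessing. Everything here (function name, confidence values, threshold) is illustrative -- Epiphany's actual pipeline runs on commercial software, not this.

```python
def fill_lost_tracking(samples, confidences, threshold=0.5):
    # Replace low-confidence samples for a single joint channel by
    # linearly interpolating between the nearest confident neighbours.
    # (Hypothetical cleanup pass, not the real mocap software's method.)
    out = list(samples)
    n = len(out)
    i = 0
    while i < n:
        if confidences[i] >= threshold:
            i += 1
            continue
        j = i
        while j < n and confidences[j] < threshold:
            j += 1                                   # lost frames span [i, j)
        left = out[i - 1] if i > 0 else (out[j] if j < n else 0.0)
        right = out[j] if j < n else left            # hold edge if gap hits an end
        for k in range(i, j):
            t = (k - i + 1) / (j - i + 1)            # linear blend weight
            out[k] = left + (right - left) * t
        i = j
    return out
```

Feed it a channel where the middle frame was garbage and it glides through the gap instead of snapping to a bad guess.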

There are ways around these limitations, though. For instance, I found myself wondering how elaborate action animations could be captured when movement is limited to a small square on the ground. The answer: they record the action piece by piece, then stitch the pieces together.

“We had a sequence in which a character was jumping out of a 4-story window,” said Purchase. “I first recorded the jump, and then the mid-air movement. Then I recorded me hitting the ground, which was a fall from just about this high,” he said, indicating just a few inches. “But when stitched together, it looks like a complete fall.”
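At its core, the stitching Purchase describes is blending the end of one clip into the start of the next. Here's a minimal sketch of the idea -- frames are just per-joint value lists, and the crossfade-window approach is my assumption, not Epiphany's exact method:

```python
def stitch(clip_a, clip_b, blend=4):
    # Join two animation clips, crossfading `blend` frames at the seam
    # so the motion doesn't pop when clip_a hands over to clip_b.
    assert len(clip_a) >= blend and len(clip_b) >= blend
    seam = []
    for i in range(blend):
        t = (i + 1) / (blend + 1)                 # weight ramps towards clip_b
        fa = clip_a[len(clip_a) - blend + i]      # fading-out frame
        fb = clip_b[i]                            # fading-in frame
        seam.append([a * (1 - t) + b * t for a, b in zip(fa, fb)])
    return clip_a[:-blend] + seam + clip_b[blend:]
```

Jump, mid-air, and landing clips go in; one continuous-looking fall comes out.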

Once my cloud of pixels was saved and brought over to another machine, it was turned into a wireframe, and attached to an animation skeleton. By dragging the skeleton’s joints around, Purchase was able to tweak things like making sure my feet stayed on the ground (and, let’s face it, fix errors in my boxing form).
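That keep-the-feet-grounded tweak is essentially a constraint pass over the skeleton data. A simplified illustration below -- the joint names and the flat-floor assumption are mine, and a DCC tool's actual constraint system is far richer than this:

```python
def lock_feet(frames, foot_joints=("foot_l", "foot_r"), floor_y=0.0):
    # Clamp foot joints so captured feet never sink below the floor
    # plane -- a crude stand-in for dragging joints around by hand.
    fixed = []
    for frame in frames:
        frame = dict(frame)              # don't mutate the caller's data
        for joint in foot_joints:
            if joint in frame:
                x, y, z = frame[joint]
                frame[joint] = (x, max(y, floor_y), z)
        fixed.append(frame)
    return fixed
```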

Epiphany also has a very modular way of dressing its characters. Instead of creating each character from scratch and importing them into the game, the team can easily swap pieces of clothing in and out. A new hat, different jeans, some jewellery, and all of a sudden you've got a new extra for the scene. Remember what I said about smarter, not harder?

"An animator doing all those moves would have taken about 3 weeks to do."

Putting that tech into play, it’s not long before we’ve got an in-game character repeatedly jabbing. And when I say “not long”, I mean about 30 minutes. So how long would that have taken normally?

“An animator doing all those moves would have taken about 3 weeks to do,” says Lean. “Hiring a normal motion capture studio could also run you $5,000 per day, traditionally, so it’s been a lot cheaper.”

Two Kinects is certainly a lot cheaper than $5k per day. Add in the fact that both these Kinects were bought second-hand at $50 each, and that’s some thrifty tech. It’s allowed things like the nightclub scene I experienced, already with myriad dancing animations, to exist in a game that only began production in March.

That has a flow-on effect, too. With the graphical style nailed down, the music already there, and the animations already in, Lean knows exactly how that aspect of the game will behave when he's fitting all the puzzle pieces of game development together in his head. And if something needs to be changed? It takes half an hour to fix.

“It also does wonders for things like posture,” says Lean. “When you’re standing still, you’re still using your muscles, just in ways you don’t notice.”

Purchase agrees: “Things like standing around or just sitting down in a very basic way look more realistic, instead of just ragdoll in a chair.”

All things considered, this feels less like work, and more like play. But I’m a newcomer to the technology -- it’s normal for me to be excited. I ask Lean: Is the novelty gone?

“Nope, I never get tired of it,” he says. “I’m still having fun.”


    I'm still waiting for Kinect porn.......

      just improvise man. Life, uh, finds a way.

    Cool stuff Jung. Good to see you're still kicking (punching?) around.

    Are you and/or Epiphany able to disclose what capture software and DCC suite(s) they use with their dual Kinect setup? I only know of iPi, RGBDemo, Brekel but the photos seem to just show the DCC side in what looks like MotionBuilder? A software-specific step outline of the process would be interesting too.

    MrRoboto out.

      Just asked Morgan Lean for you -- he says it's iPi with Motion Builder.

        Nice one. Thanks.

    Be interested to see how the new model kinect will go with this kind of thing.
