Brandon Erickson at Gamecritics puts forth an interesting question: is there a “critical period” in children for learning video game skills, just as there is for language? Such a period wouldn’t preclude learning a language, or the hand-eye coordination specific to video games, later in life, but it may explain why kids who grow up with controllers as second nature will, depending on the game, always whip the arse of an adult who learned on that stupid ColecoVision phone-looking thing.
Using Super Smash Bros. Brawl as an example, Brandon tells us that his brother, despite having little recent gaming experience, utterly thrashed him. In a more complex game like Call of Duty 4, however, Brandon had the upper hand.
“This makes me wonder if gaming skill operates similarly to language acquisition. Maybe my brother’s early-life exposure to previous Smash Bros. games gives him a built-in advantage that my practice will never overcome. It could be that after age 12 our brains can’t instinctively master certain gameplay styles we weren’t previously exposed to, hence my suckiness at newer fighting games. I’m not saying older people can’t master new gameplay styles, but rather that there might be a developmental cutoff after which achieving such mastery becomes much harder.”
That’s a pretty good question, and I’m disappointed I didn’t think of it first. So I’m linking to it. I’d like to think this hypothesis could be supported by child development experts, if only to explain the competitive imbalance with my father, who is 58 and can still kick my arse in driveway hoops. (Thanks for teaching me a jump shot during that “critical period,” Pops.)
Is there a “critical period” for videogame skill acquisition? [Gamecritics.com]