As commenter Duncan notes, there are serious worries that the results of the Interactive Australia 2009 Report look TOO good for gamers. But the report's author, Jeffrey Brand, flagged the methodology at the outset of today's launch, and argued during the presentation that earlier concerns about bias can now be laid to rest: the trends from the past reports (2005 & 2007) to this one are very similar, and in fact increasingly favourable, which would make sense.
That said, I think we will still have to worry about those politicians who decide these matters based on emotional lobbying efforts instead of hard statistical evidence. Numbers like these can't help but look "wrong" if you still believe that games are simply toys for kids.
The money quote from Dr. Brand after the initial discussion of the methodology: "This is big audience research by any measure."
A more detailed run-down of the methods, sample and statistical foundations after the jump. Oh, and one more sweet stat... 68% of all respondents identified as playing computer or video games. That's more than two in every three Australians.
UPDATE: Sample and methods details updated to reflect the much more detailed information supplied near the end of the report.
Level 8. Methods
Stage 1. Introduction Interactive Australia provides data on who is playing games in Australia, how gamers' attitudes and behaviours compare with those of non-gamers, the nature of the games market, the importance of games in the family experience, and the role of online access in game purchasing and play. The study is based on a national random sample of 1614 Australians who responded to more than 75 questions covering over 300 data points in a 20-minute online survey run by ACNielsen Surveys Australia. Two units of analysis are explored in the study: the household and the individual player within the household.
Stage 2. Sample The target sample was set at 1600 households in order to attract game and non-game households in sufficient numbers for statistical comparison. IA9 is based on a national random sample of 1614 households surveyed in July 2008. Multiple units of analysis are explored in the study: the household (n=1614), all individuals within game households (n=4671), plus the participating adult from households without a game device (n=181). Of the 4852 individuals studied, 3162 (68%) were identified as gamers. A game household was one containing any device for playing a computer or video game, excluding mobile phones and smartphones/PDAs. A gamer was any person who answered "yes" when asked whether they play computer or video games. The response rate was 88%, demonstrating the high effectiveness of the Your Voice Panel. The margin of error is ±2.4% for the national sample comparing all households and ±1.8% for all gamers.
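For readers who want to sanity-check those margins of error, they line up with the standard 95%-confidence formula for a proportion at the most conservative p = 0.5. The report doesn't state which formula it used, so this is a sketch under that assumption; note the gamer figure comes out nearer ±1.7% than the reported ±1.8%, which may just be rounding.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of a 95% confidence interval for a proportion.

    p=0.5 is the most conservative (widest) assumption; z=1.96 is the
    normal critical value for 95% confidence.
    """
    return z * math.sqrt(p * (1 - p) / n)

print(f"households (n=1614): +/-{margin_of_error(1614):.1%}")  # ~2.4%
print(f"gamers (n=3162):     +/-{margin_of_error(3162):.1%}")  # ~1.7%
```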
Stage 3. Methodology
SURVEY CONTENTS The national survey included 10 sections. The first section explored media and non-media leisure and was completed by all participants. The second section explored Internet and online access (all participants). The third section asked about the presence of game devices within the household (all participants) and was used to distinguish game from non-game households. The fourth section asked questions about the participants' own game-play behaviours and attitudes (game households only). The fifth section asked about all other household members, determining the game-play status of each and the game-play habits of the one who plays most. The sixth section asked about classification knowledge and attitudes (all households). Section seven asked questions about parents' monitoring and other uses of games with their children (game households only, parents only). Section eight asked questions about game collections in the home and about game piracy (game households only). Section nine assessed attitudes toward games, their role in society and the impact of interactivity on media experiences (all households). The last section asked demographic questions including age, education, work status, religiosity, and income.
CONDUCT OF SURVEY The national random sample survey was designed by a team at the Centre for New Media Research at Bond University. It was designed to take the pulse of gaming in Australia and to allow comparisons between game and non-game households, and between game players and non-players. The survey was conducted in July 2008 by ACNielsen Australia in Sydney. The online survey used ACNielsen's "Your Voice Panel", which draws from 80,000 Australian households with representation in every state and territory. The Your Voice Panel is structured to be representative of the Australian population, with the exception that all panellists have the online access through which surveys take place. Panellists volunteer for and are recruited to the panel, and receive benefits in exchange for their time and quality responses to social and market surveys.
ACNielsen is the world's leading provider of information and research to the consumer products and services industries. It has offices in more than 100 countries worldwide. In Australia, ACNielsen has been collecting and analysing information on consumer attitudes and purchase behaviour for more than 50 years.
DATA ANALYSES ACNielsen provided the CNMRE with raw data from the survey for statistical analysis. The data were analysed by study authors Jeff Brand and Jill Borchard using SPSS V15 in the Windows operating environment. Statistical procedures included simple descriptive statistics such as frequencies, cross-tabulations and means, and tests of significance such as Chi-square and one-way ANOVA. To include results for all members of a given household, the Vars-to-Cases restructuring procedure was used to create individual records for all persons identified by the participants in the study.
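SPSS's Vars-to-Cases restructure turns one wide row per household (with repeated columns for each member, as the survey collected them) into one row per person, which is how the household sample yields individual-level records. A minimal sketch of the same operation in pandas, with entirely hypothetical column names since the report's actual variable list isn't published here:

```python
import pandas as pd

# Hypothetical wide-format household records: one row per household,
# with numbered columns for each member's responses.
households = pd.DataFrame({
    "household_id": [1, 2],
    "age_1": [42, 35], "gamer_1": ["yes", "no"],
    "age_2": [14, 33], "gamer_2": ["yes", "yes"],
})

# Equivalent of SPSS's Vars-to-Cases restructure: one row per person.
persons = pd.wide_to_long(
    households, stubnames=["age", "gamer"],
    i="household_id", j="member", sep="_",
).reset_index()

print(len(persons))  # 4 individual records from 2 household rows
```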
Data reduction procedures included collapsing the response range for some questions to simplify presentation. Some measures were combined into indices where a frequency or mean across a combination of measures simplified the presentation of findings. Missing values were excluded from analysis on a per-question basis unless multiple measures were examined conjointly; for these, casewise deletion was applied.
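The two missing-data strategies described above, per-question exclusion versus casewise (listwise) deletion for conjoint analyses, can be illustrated with a toy pandas example; the data and column names are invented purely for illustration:

```python
import pandas as pd

# Hypothetical responses with item-level missing values (None -> NaN).
df = pd.DataFrame({
    "hours_played": [10, None, 5, 8],
    "attitude_score": [4, 3, None, 5],
})

# Per-question analysis: drop missing values for that question only,
# so each question keeps its maximum usable sample.
mean_hours = df["hours_played"].dropna().mean()

# Conjoint analysis: casewise deletion drops any respondent missing a
# value on ANY of the measures examined together.
complete = df.dropna()

print(mean_hours, len(complete))
```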