Brain-training programs like the ones offered by Lumosity and LearningRx claim to boost intelligence and even offset the effects of ageing. A re-evaluation of the existing scientific literature on the matter shows that these claims are largely unsupported.
Illustration by Jim Cooke.
A new study published in Psychological Science in the Public Interest concludes that brain-training games don’t work as advertised, a finding that could make life more difficult for companies that develop such programs. Earlier this year, the US Federal Trade Commission fined Lumosity nearly $US2 million ($2.6 million) for making false claims about the effectiveness of its product, and for failing to produce supporting science. This latest study represents another serious setback for the company and others with similar offerings.
Back in 2014, a consensus statement published by more than 70 scientists claimed that brain games “do not provide a scientifically grounded way to improve cognitive functioning or to stave off cognitive decline”. Several months later, a group of 133 scientists put out their own statement, claiming that the scientific literature is filled with examples showing the benefits of brain training for a variety of cognitive tasks and everyday activities.
As University of Illinois at Urbana-Champaign psychology professor Daniel Simons and his fellow researchers rightfully asked in the preface of the new study, “How could two teams of scientists examine the same literature and come to conflicting ‘consensus’ views about the effectiveness of brain training?”
To find out what’s going on, the researchers re-investigated the existing literature, pulling up over 130 published, peer-reviewed, scientific studies that are typically cited by brain-training companies. The researchers went through each paper, scrutinising the evidence and evaluating factors such as sample size and the presence of control groups. Very few of them passed these sniff tests.
“Based on our extensive review of the literature cited by brain-training companies in support of their claims, coupled with our review of related brain-training literatures that are not currently associated with a company or product, there does not yet appear to be sufficient evidence to justify the claim that brain training is an effective tool for enhancing real-world cognition,” conclude the authors in the study.
Not surprisingly, brain training can improve performance on the particular task or puzzle being trained. But there was very little evidence that these benefits extend beyond the trained task itself, and the studies offer scant support for the claim that these programs improve everyday cognitive performance.
“It’s disappointing that the evidence isn’t stronger,” noted Simons in an NPR article. “It would be really nice if you could play some games and have it radically change your cognitive abilities. But the studies don’t show that on objectively measured real-world outcomes.”
Keep that in mind the next time you’re tempted to click on one of those ubiquitous Lumosity ads.
[Psychological Science in the Public Interest via NPR]
This story originally appeared on Gizmodo
Comments
6 responses to “Brain Training Games Might Not Have Any Real World Benefit, Study Finds”
Title is misleading.
The paper is a literature review; they haven’t actually performed their own studies to prove their hypothesis.
Additionally they only discredit the studies performed in the literature reviewed, which is very different to proving the antithesis.
At best this paper takes us back to a neutral standpoint, where it has not been determined whether or not brain training games have an effect on cognitive ability or degradation.
It does say they conducted their own review of “related brain-training literatures that are not currently associated with a company or product”.
I think the bigger problem is the author seems to confuse absence of evidence with evidence of absence. As mysteryman notes, this review’s conclusion is that the claim of cognitive improvement is unsupported, not that it’s proven false. The title doesn’t match the study’s findings; it’d be more accurate replacing “aren’t” with “might not be”.
Conducting a review and determining the majority of studies aren’t worth the paper they’re written on is completely different from performing a study themselves.
They haven’t proven anything one way or the other, just shown that previous studies are unreliable.
Fair cop. I’ll fix it up.
Whew, glad I finished work before this happened :p