
Think video games make you smarter? Not so fast…


Updated on 9/13/11 at 4:50pm: Minor tone/wording update to the conclusion and a little more detail on alternative explanations for correlational results after the break.

Try to spot the flaw in this study. A scientist recruits a group of subjects to test the effectiveness of a new drug that purportedly improves attention. After giving subjects a pre-test to measure their attention, the experimenter tells the subjects all about the exciting new pill, and after they take the pill, the experimenter re-tests their attention. The subjects show significantly better performance the second time they’re tested.

This study would never pass muster in the peer review process; the flaws are too glaring. First, the subjects are not blind to the hypothesis (the experimenter told them about the exciting new drug), so they could be motivated to try harder the second time they take the test. The experimenter isn't blind to the hypothesis either, and might influence subject performance as well. There's also no placebo control condition to account for the improvement people typically show when performing a task a second time. In fact, this study lacks all of the gold-standard controls needed in a clinical intervention.

Walter Boot, Daniel Blakely, and I have a new paper that just appeared in Frontiers in Psychology this week (link) arguing that similarly serious flaws afflict many of the studies underlying the popular notion that playing action video games enhances cognitive abilities. The flaws are sometimes more subtle, but they're remarkably common: none of the existing studies include all of the gold-standard controls necessary to draw a firm conclusion about the benefits of gaming on cognition. When coupled with publication biases that exclude failures to replicate from the published literature, these flaws raise doubts about the mere existence of a benefit.

The evidence in favor of a benefit from video games on cognition takes two forms: (a) expert/novice differences and (b) training studies.

The majority of studies compare the performance of experienced gamers to non-gamers, and many (although not all) show that gamers outperform non-gamers on measures of attention, processing speed, etc. (e.g., Bialystok, 2006; Chisholm et al., 2010; Clark, Fleck, & Mitroff, 2011; Colzato et al., 2010; Donohue, Woldorff, & Mitroff, 2010; Karle, Watter, & Shedden, 2010; West et al., 2008). Such expert/novice comparisons are useful and informative, but they do not permit any causal claim about the effects of video games on cognition. In essence, they are correlational studies rather than causal ones. Perhaps the experienced gamers took up gaming because they were already better at those basic cognitive tasks; their superior cognitive skills are what made them successful gamers. Or some third factor, such as intelligence or reaction speed, might contribute both to interest in gaming and to performance on the cognitive tasks.
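To see how a third factor alone can produce an expert/novice difference, consider this minimal Python simulation. It's an illustration with made-up numbers, not data or an analysis from any study: baseline processing speed makes people both more likely to take up gaming and better at an attention task, so "gamers" outscore "non-gamers" even though gaming does nothing.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# A latent third factor: baseline processing speed (in SD units).
speed = rng.normal(0.0, 1.0, n)

# Faster people are more likely to take up gaming. Gaming itself has
# NO causal effect on anything in this simulation.
is_gamer = rng.random(n) < 1 / (1 + np.exp(-speed))

# Attention-task score depends only on speed plus noise.
score = 0.5 * speed + rng.normal(0.0, 1.0, n)

print(f"mean score, gamers:     {score[is_gamer].mean():+.3f}")
print(f"mean score, non-gamers: {score[~is_gamer].mean():+.3f}")
```

The simulated gamers reliably outscore the non-gamers, and a naive reading would credit the games; here, the entire difference comes from who chooses to play.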

Fortunately, few researchers make the mistake of drawing causal conclusions from a comparison of experts and novices (although media headlines do occasionally make that mistake). We argue, though, that it’s not even clear that there are real differences between gamers and non-gamers on these basic cognitive measures. The reason is that most studies suffer from the same problem as the attention-drug thought experiment I described at the start of this post. Experts in these studies are recruited because they are gamers (novices are often recruited because they are non-gamers), meaning that the subjects have a reason to suspect that their game experience is relevant to their performance in the experiment. Many gamers are familiar with claims in the media and in the scientific literature that gamers outperform non-gamers on cognition and perception tasks. Consequently, they are highly motivated to perform well. They are akin to a drug treatment group that has been told how wonderful the drug is. In other words, they are not blind to their condition. When the tasks are at least somewhat similar to the games they’ve played, they might well figure out the hypothesis of the study. The only way around this motivation effect is to recruit subjects with no mention of gaming and only ask them about their gaming experience after they have completed the primary cognitive tasks, but only a handful of studies have done that. And, even when they have done so, gamers might still be more motivated to perform well than novices because they are asked to perform a game-like task on a computer. In other words, any expert-novice differences might reflect different motivation and not different cognitive abilities.

Even if we accept the claim that gamers outperform non-gamers on cognitive tasks, such differences do not permit the conclusion that gaming affects cognition. The only way to establish causation is to use a training intervention, much like a clinical trial, in which non-gamers receive game experience and the experimenter measures improvements in cognitive abilities following training (e.g., Green & Bavelier, 2003, 2006a, 2006b, 2007). Training studies are far more expensive and time-consuming to conduct, and only a handful of labs have even attempted them. And at least one large-scale training study has failed to replicate a benefit from action game training (Boot et al., 2008; see also Irons, Remington, & McLean, 2011, and Murphy & Spencer, 2009 for cross-sectional failures to replicate). In our Frontiers paper, we discuss a number of concerns about these training studies, which, taken together, cast serious doubt on the claim that games benefit cognition:

  1. The studies are not double-blind. The experimenters know the hypotheses and could subtly influence the experiment outcome. Such experimenter bias effects are likely to be small, though.
  2. The subjects themselves might not be entirely blind to the hypothesis. In a drug study, a placebo condition is considered to be inadequate if subjects can figure out whether they are in the experimental or the control condition. For example, if the drug has nasty side effects but a placebo does not, the subjects can guess that they are receiving the drug (as can the experimenter), foiling the purpose of having the placebo baseline condition. In video game training studies, subjects might well see a connection between their training condition and the cognitive tasks. If so, motivation effects could creep back into the results.
  3. Those training studies that find a benefit of game training often show an unusual pattern in which the control group shows no improvement at all from pre-test to post-test. One of the most consistent findings in the literature on learning and practice is that people get better with practice, so the control group should improve from pre-test to post-test; if game training affects cognition, the improvement should simply be bigger in the experimental condition (a pattern sketched in the simulation after this list). When the control group shows no improvement the second time it performs a task, we should worry that the control condition is somehow inadequate. Almost all of the published training studies showing a benefit of video game training relative to a control group show no test-retest effect in the control group, whereas the studies failing to replicate a training benefit typically show the expected test-retest improvement in both groups but no selective benefit for video game training. That raises the concern that the “action” in these studies comes not from a benefit of action game training but from some unusual cost in the control condition.
  4. It is not entirely clear how many independent replications of training benefits actually exist in the literature. A number of papers now report that 10, 30, or even 50 hours of training produces benefits on one or two outcome measures. When conducting a large-scale training study, it's typical to test a large battery of outcome measures, because it would be prohibitively expensive to do all of that training with just one chance to find a benefit. Yet many of the papers touting the benefits of training for cognition discuss only one or two outcome measures, leaving open the possibility that the same subjects were actually tested on many unreported measures (the code after this list works out how quickly spurious "benefits" accumulate when many outcomes are tested). Moreover, based on the game scores noted in some of these papers, it appears that data from different outcome measures for some of the same trained subjects were reported in separate papers. If the groups of subjects tested in separate papers overlapped, then the papers do not constitute independent tests of the benefits of gaming. Unfortunately, these details are underreported in the manuscripts, so there might be far fewer independent replications than there appear to be. Together with the known failures to replicate training benefits and possible file-drawer issues, it is unclear whether the accumulated evidence supports a benefit of training at all.
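Points 3 and 4 are easy to make concrete. The short Python sketch below (again, an illustration with invented effect sizes, not data from any study) first shows the pre-test/post-test pattern a genuine training benefit should produce: both groups improve, and the trained group improves more. It then works out the multiple-outcomes arithmetic: with many outcome measures each tested at the conventional .05 threshold, the odds of at least one spurious "benefit" climb quickly even when training does nothing.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200  # subjects per group

# Point 3: the pattern a genuine training benefit should produce.
# Invented effect sizes: everyone gains 0.5 SD from simple test-retest
# practice; trained subjects gain an extra 0.3 SD from the game.
practice_gain, training_gain = 0.5, 0.3

pre_ctrl = rng.normal(0.0, 1.0, n)
post_ctrl = pre_ctrl + practice_gain + rng.normal(0.0, 0.3, n)
pre_game = rng.normal(0.0, 1.0, n)
post_game = pre_game + practice_gain + training_gain + rng.normal(0.0, 0.3, n)

print(f"control gain: {np.mean(post_ctrl - pre_ctrl):.2f}")  # ~0.5, not 0.0
print(f"trained gain: {np.mean(post_game - pre_game):.2f}")  # ~0.8

# Point 4: chance of at least one false positive among k outcome
# measures, each tested at alpha = .05, when training does nothing.
alpha = 0.05
for k in (1, 2, 5, 10, 20):
    print(f"{k:2d} outcomes -> P(at least one spurious hit) = {1 - (1 - alpha) ** k:.2f}")
```

A perfectly flat control group, by contrast, is the signature to be suspicious of: it suggests something odd about the baseline condition rather than a benefit of the game. And with 20 outcome measures, a spurious "hit" is more likely than not.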

Given that expert/novice studies tell us nothing about a causal benefit of video games for cognition and that the evidence for training benefits is mixed and uncertain, we argue that the enthusiasm for video game training as a cognitive elixir needs to be reined in. In some ways, the case that video games can enhance the mind is the complement to recent fear-mongering that the internet is making us stupid. In both cases, the claim is that technology is altering our abilities, and in both cases, the claims seem to go well beyond the evidence. The cognitive training literature shows that we can enhance cognition, but the effects of practice tend to be narrowly limited to the tasks we practice (see Ball et al., 2002; Hertzog, Kramer, Wilson, & Lindenberger, 2009; Owen et al., 2010; Singley & Anderson, 1989, for examples and discussion). Practicing crossword puzzles will make you better at crossword puzzles, but it won't help you recall your friend's name when you meet him on the street. None of the gaming studies provide evidence that the benefits, to the extent that they exist at all, transfer to anything other than simple computer-based laboratory tasks.

If you enjoy playing video games, by all means do so. Just don't view them as an all-purpose mind builder. There's no reason to think that gaming will help your real-world cognition any more than would just going for a walk. If you want to generalize your gaming prowess to real-world skills, you could always try your hand at paintball. Or, if you like Mario, you could spend some time as a plumber and turtle-stomper.

Citation for our Frontiers in Psychology article:

Boot WR, Blakely DP and Simons DJ (2011) Do action video games improve perception and cognition? Front. Psychology 2:226. doi: 10.3389/fpsyg.2011.00226.  Link to Full Text

Other Sources Cited:

  • Ball, K., Berch, D. B., Helmers, K. F., Jobe, J. B., Leveck, M. D., Marsiske, M., et al. (2002). Effects of cognitive training interventions with older adults: A randomized controlled trial. JAMA: Journal of the American Medical Association, 288(18), 2271-2281.
  • Bialystok, E. (2006). Effect of bilingualism and computer video game experience on the Simon task. Canadian Journal of Experimental Psychology, 60, 68-79.
  • Chisholm, J.D., Hickey, C., Theeuwes, J., & Kingstone, A. (2010). Reduced attentional capture in video game players. Attention, Perception, & Psychophysics, 72, 667-671.
  • Clark, K., Fleck, M. S., & Mitroff, S. R. (2011). Enhanced change detection performance reveals improved strategy use in avid action video game players. Acta Psychologica, 136, 67-72.
  • Colzato, L. S., van Leeuwen, P. J. A., van den Wildenberg, W. P. M., & Hommel, B. (2010). DOOM’d to switch: superior cognitive flexibility in players of first person shooter games. Frontiers in Psychology, 1, 1-5.
  • Donohue, S. E., Woldorff, M. G., & Mitroff, S. R. (2010). Video game players show more precise multisensory temporal processing abilities. Attention, Perception, & Psychophysics, 72, 1120-1129.
  • Green, C. S. & Bavelier, D. (2003). Action video game modifies visual selective attention. Nature, 423, 534-537.
  • Green, C.S., & Bavelier, D. (2006a). Effect of action video games on the spatial distribution of visuospatial attention. Journal of Experimental Psychology: Human Perception and Performance, 32, 1465-1478.
  • Green, C. S. & Bavelier, D. (2006b). Enumeration versus multiple object tracking: the case of action video game players. Cognition, 101, 217-245.
  • Green, C.S. & Bavelier, D. (2007). Action video game experience alters the spatial resolution of attention. Psychological Science, 18, 88-94.
  • Hertzog, C., Kramer, A. F., Wilson, R. S., & Lindenberger, U. (2009). Enrichment effects on adult cognitive development. Psychological Science in the Public Interest, 9, 1–65.
  • Irons, J. L., Remington, R. W., & McLean, J. P. (2011). Not so fast: Rethinking the effects of action video games on attentional capacity. Australian Journal of Psychology, 63. doi: 10.1111/j.1742-9536.2011.00001.x
  • Karle, J.W., Watter, S., & Shedden, J.M. (2010).  Task switching in video game players:  Benefits of selective attention but not resistance to proactive interference.  Acta Psychologica, 134, 70-78.
  • Murphy, K. & Spencer, A. (2009). Playing video games does not make for better visual attention skills. Journal of Articles in Support of the Null Hypothesis, 6, 1-20.
  • Owen, A.M., Hampshire, A., Grahn, J.A., Stenton, R., Dajani, S., Burns, A.S., Howard, R.J., & Ballard, C.G. (2010).  Putting brain training to the test.  Nature, 465, 775-779.
  • Singley, M. K., & Anderson, J. R. (1989). The transfer of cognitive skill. Cambridge, MA: Harvard University Press.
  • West, G. L., Stevens, S. S., Pun, C., & Pratt, J. (2008). Visuospatial experience modulates attentional capture: Evidence from action video game players. Journal of Vision, 8, 1-9.
