In a learning test, people who play video games performed better than those who don’t, and their brains appeared to be more active in regions linked to learning and memory.
So concludes a study from Ruhr-University Bochum in Germany that was published in the journal Behavioural Brain Research.
The researchers explain that recent studies have suggested that playing video games may benefit cognition. However, the brain mechanisms involved are poorly understood.
They focused on “a widely unexplored area in gaming research” called “probabilistic category learning.” This type of learning concerns acquiring and classifying knowledge and using it to predict future events.
A traditional way of testing probabilistic category learning is the so-called weather prediction task, which researchers use to gain “insight into implicit forms of learning, cognitive flexibility, and the use of feedback signals in the brain.”
For their investigation, the team recruited 17 video gamers and 17 non-gamers. The gamers qualified on the basis that they spent at least 15 hours per week playing action-based video games; the non-gamers either did not play at all or did so only infrequently.
Both groups completed the weather prediction task. As they did so, the researchers used MRI to record their brain activity.
To complete the task, the participants had to look at three cue cards with different patterns on them and then predict the weather. They were asked, “Will there be sun or rain?” They were then told straight away whether their answer was right or wrong.
Because each card is only a partially accurate predictor of the weather, the correct answer depends on the combined probability of the cards shown together.
For example, a combination of cue cards might contain: a card whose pattern means a 20 percent chance of rain and 80 percent chance of sun; a second card that means 80 percent chance of rain and 20 percent chance of sun; and a third that means 60 percent chance of rain and 40 percent chance of sun. The most likely outcome for this combination would therefore be rain.
The subjects performed the task again and again, with different combinations of cue cards. Thus, by receiving feedback, they learned which card combinations were related to which weather condition.
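The trial structure described above can be sketched in a few lines of code. This is only an illustrative simulation, not the study's actual design: the paper does not specify how cue probabilities are combined, so the sketch assumes the rain probabilities of the displayed cards are simply averaged, and the function names are invented for the example.

```python
import random

def rain_probability(cards):
    """Combine per-card rain probabilities by simple averaging (an assumption,
    not necessarily the rule used in the actual study)."""
    return sum(cards) / len(cards)

def run_trial(cards, guess, rng=random.random):
    """Sample the weather from the combined probability and return the
    outcome plus immediate right/wrong feedback, as in the task."""
    p_rain = rain_probability(cards)
    outcome = "rain" if rng() < p_rain else "sun"
    return outcome, guess == outcome

# The example combination from the text: 20%, 80%, and 60% chance of rain.
cards = [0.20, 0.80, 0.60]
print(rain_probability(cards))  # about 0.53, so rain is the more likely outcome
```

Over many such trials with different card combinations, a participant receiving this feedback can gradually learn which combinations tend to predict rain and which predict sun, even though no single trial is fully informative.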
After they finished the task, the participants completed a questionnaire that tested how much knowledge they had retained about the cue card combinations.
The researchers found that the video gamers performed much better at predicting weather outcomes from cue card combinations than the non-gamers.
Even with cue card combinations that carried high uncertainty, the gamers still outperformed the non-gamers.
When the researchers analyzed the participants’ questionnaire responses, they found that the video gamers had retained more factual knowledge about the cue card combinations and the associated weather outcomes.
Analysis of the MRI scans revealed that both gamers and non-gamers showed the same level of activity in brain areas that are linked to “attention and executive function” and certain “memory-associated regions.”
However, the scans also showed notable brain differences between gamers and non-gamers. For example, the gamers showed stronger activity in the hippocampus and other brain areas that are important for “semantic memory, visual imagery, and cognitive control.”
“We think that playing video games trains certain brain regions like the hippocampus,” says first study author Sabrina Schenk.
The study’s findings are likely to be significant not only for young people, but also for older generations, because age-related memory decline is linked to changes in the hippocampus.
“Maybe we can treat that with video games in the future,” suggests Schenk.
“Our study shows that gamers are better in analyzing a situation quickly, to generate new knowledge and to categorize facts – especially in situations with high uncertainties.”