Neuroscientists reveal how the brain is able to focus attention on objects, such as someone’s face, when presented with a range of visual information.
This kind of attention – which allows us to pick a face we recognize out of a crowd of people – is called “object-based attention.” Scientists have previously known much less about it than about “spatial attention,” which involves focusing on a particular location.
But the team behind the new study, from the Massachusetts Institute of Technology (MIT), found that these two variants of attention actually use similar mechanisms and related brain regions.
“The interactions are surprisingly similar to those seen in spatial attention,” says Robert Desimone, the Doris and Don Berkey Professor of Neuroscience, director of MIT’s McGovern Institute for Brain Research and senior author of the paper. “It seems like it’s a parallel process involving different areas.”
Both spatial and object-based attention are governed by the prefrontal cortex – the brain region that controls most cognitive functions. The prefrontal cortex decides which areas of the visual cortex receive sensory input. In spatial attention, for example, it directs the visual cortex to the assigned area of focus within the viewer’s field of vision.
Prof. Desimone and team – who published their results in the journal Science – found that, in object-based attention, an area of the prefrontal cortex called the inferior frontal junction (IFJ) tunes visual processing areas to recognize specific types of objects.
The IFJ – which allows us to gather and coordinate information while performing a task – has two major allies in facilitating object-based attention.
These are a brain region responsible for processing faces, known as “the fusiform face area” (FFA), and a brain region that processes information about places – “the parahippocampal place area” (PPA).
To examine the relationship between these brain areas, the researchers used magnetoencephalography to scan the brains of participants who were presented with a series of overlapping images of faces and houses.
The two image streams were presented to the subjects at different rhythms – one at two images per second and the other at 1.5 images per second.
Study lead author, Daniel Baldauf, also of the McGovern Institute, says:
“We wanted to frequency-tag each stimulus with different rhythms. When you look at all of the brain activity, you can tell apart signals that are engaged in processing each stimulus.”
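The idea behind frequency tagging is that each stimulus stream flickers at its own rate, so the brain’s response to each one shows up as a distinct peak in the frequency spectrum of the recorded signal. The toy simulation below is only an illustrative sketch of that principle, not the study’s actual MEG analysis; the 2 Hz and 1.5 Hz rates come from the article, while the sampling rate, amplitudes, and noise level are arbitrary assumptions.

```python
import numpy as np

# Illustrative sketch of frequency tagging (assumed parameters, not the
# study's real analysis): two stimulus streams are tagged at 2 Hz and
# 1.5 Hz, so their responses occupy separate bins in the spectrum.
fs = 100                        # sampling rate in Hz (assumption)
t = np.arange(0, 20, 1 / fs)    # 20-second simulated recording

face_rate, house_rate = 2.0, 1.5  # tagging rhythms from the article
rng = np.random.default_rng(0)

# Simulated sensor signal: responses to both streams plus noise.
signal = (1.0 * np.sin(2 * np.pi * face_rate * t)
          + 0.6 * np.sin(2 * np.pi * house_rate * t)
          + 0.5 * rng.standard_normal(t.size))

# A Fourier transform separates the two tagged responses by frequency.
spectrum = np.abs(np.fft.rfft(signal)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

face_power = spectrum[np.argmin(np.abs(freqs - face_rate))]
house_power = spectrum[np.argmin(np.abs(freqs - house_rate))]
print(f"response at 2.0 Hz (faces):  {face_power:.2f}")
print(f"response at 1.5 Hz (houses): {house_power:.2f}")
```

Even though the two responses overlap completely in time, reading off the spectrum at each tagging frequency recovers them separately – the same logic that let the researchers tell apart face- and house-related brain activity.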
The participants were instructed to pay attention to either faces or houses. Because the faces and houses occupied the same position, the participants’ brains could not distinguish them using spatial information, so the test measured object-based rather than spatial attention.
When the subjects were told to look at faces, activity in the FFA and IFJ synchronized; when they were told to look at houses, the PPA and IFJ synchronized.
A further magnetic resonance imaging (MRI) scan also found that the IFJ is “highly connected” by white matter to both the FFA and the PPA.
The MIT neuroscientists are now investigating how the brain is able to “shift focus” between different types of sensory input, such as sound and vision. From this research, they are interested in seeing whether people could be trained to control brain interactions that would better focus their attention.
Earlier this year, Medical News Today reported on a study by scientists at Washington University School of Medicine in St. Louis, MO, who found that the mechanisms that allow our brains to concentrate are “roughly akin to tuning multiple walkie-talkies to the same frequency.”