Prof. Amir Amedi demonstrates the concept of sensory substitution through the use of a Sensory Substitution Device.
Image credit: Sasson Tiram/Hebrew University of Jerusalem
Conventional wisdom holds that the brain is divided into distinct regions by the sensory inputs that activate them: the visual cortex is the region of the brain that processes sight, and the auditory cortex is where sound is interpreted. Within these regions, specialized sub-regions are thought to handle specific tasks, such as identifying number symbols or written words and letters independently of their sound or meaning.
At Hebrew University's Amedi Lab for Brain and Multisensory Research, however, work with unique tools called "Sensory Substitution Devices" (SSDs) - the results of which are published in Nature Communications - has challenged this view.
The SSDs take information provided by one sense and present it through another. In one project, for example, visual images from smartphones and webcams were translated into a soundscape that allowed blind users to build a mental image of objects, including their physical dimensions and color.
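To make the idea concrete, here is a minimal sketch of one well-known style of visual-to-auditory mapping, in which an image is scanned column by column from left to right, each pixel's row sets the pitch of a sine tone, and its brightness sets the loudness. This is only an illustration of the general principle; it is not the lab's actual device or algorithm, the function name and parameters are invented for this example, and the color encoding mentioned above is omitted.

```python
import math

def image_to_soundscape(image, duration=1.0, sample_rate=8000,
                        f_min=200.0, f_max=2000.0):
    """Convert a 2D grayscale image (rows of floats in [0, 1]) into a
    mono waveform: one time slice per column, one sine per bright pixel."""
    n_rows = len(image)
    n_cols = len(image[0])
    samples_per_col = int(duration * sample_rate / n_cols)
    # Top rows map to high frequencies, bottom rows to low ones.
    freqs = [f_max - (f_max - f_min) * r / max(n_rows - 1, 1)
             for r in range(n_rows)]
    waveform = []
    for c in range(n_cols):                     # left-to-right sweep
        for s in range(samples_per_col):
            t = s / sample_rate
            # Each bright pixel in this column contributes a sine tone
            # at its row's frequency, scaled by its brightness.
            sample = sum(image[r][c] * math.sin(2 * math.pi * freqs[r] * t)
                         for r in range(n_rows))
            waveform.append(sample / n_rows)    # normalize amplitude
    return waveform

# A tiny 3x4 test image: a single bright pixel in the top-left corner,
# which should produce a high-pitched tone at the start of the sweep.
img = [[1.0, 0.0, 0.0, 0.0],
       [0.0, 0.0, 0.0, 0.0],
       [0.0, 0.0, 0.0, 0.0]]
wave = image_to_soundscape(img, duration=0.1)
```

With training, a listener can learn to decode such sweeps back into shapes, which is the core idea behind "seeing through sound."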
The researchers report that blind users of the SSDs could also "read" letters by identifying written words through sound.
"These devices can help the blind in their everyday life," explains Prof. Amir Amedi, "but they also open unique research opportunities by letting us see what happens in brain regions normally associated with one sense, when the relevant information comes from another."
Of particular interest to the team was whether the blind people in the study used the "visual word form area" sub-region of the brain - the area that identifies the shapes of words and letters in sighted people's brains - to process this information.
Blind people use 'visual' brain regions when interpreting sound cues about objects
According to lead researcher Sami Abboud, the blind participants did indeed "see through sound" using the same "visual" brain regions to interpret this information as sighted people. "These regions are preserved and functional even among the congenitally blind who have never experienced vision," Abboud confirms.
Therefore, blind people reading Braille using their fingers will still use the "visual" areas of their brain as they process the text.
Using functional MRI (fMRI), the researchers studied the brain activity of their blind participants in real time while they used the SSDs. They found that specialized brain areas are activated by the task the brain is performing at that moment, rather than by the sense involved.
This discovery leads to another question: why do these functions develop in specific anatomical locations if sensory input is not the key to their development?
Prof. Amedi suggests an explanation may lie in constantly evolving connectivity patterns between sub-regions of the brain, such as the visual word form area and language-processing areas.
"This means that the main criteria for a reading area to develop is not the letters' visual symbols, but rather the area's connectivity to the brain's language-processing centers," he hypothesizes. "Similarly a number area will develop in a region which already has connections to quantity-processing regions."
This type of mechanism, Prof. Amedi says, can help explain how our brains adapt quickly to constantly changing cultural and technological innovations:
"If we take this one step further, this connectivity-based mechanism might explain how brain areas could have developed so quickly on an evolutionary timescale. We've only been reading and writing for several thousand years, but the connectivity between relevant areas allowed us to create unique new centers for these specialized tasks. This same 'cultural recycling' of brain circuits could also be true for how we will adapt to new technological and cultural innovations in the current era of rapid innovation, even approaching the potential of the Singularity."