Listening and reading evoke almost identical brain activity


Whether the words of a story come from listening or reading, the brain appears to activate the same areas to represent their semantics, or meaning, according to new research.
Using detailed brain scans, scientists at the University of California (UC), Berkeley, have created interactive 3D semantic maps that can accurately predict which parts of the brain will respond to particular categories of words.
“At a time when more people are absorbing information via audiobooks, podcasts, and even audio texts,” says lead study author Fatma Deniz, a postdoctoral researcher in neuroscience at UC, Berkeley, “our study shows that, whether they’re listening to or reading the same materials, they are processing semantic information similarly.”
3D semantic maps
To create the 3D semantic brain maps, the team invited volunteers to listen to and read the same stories while the researchers recorded detailed functional MRI scans of their brains.
The scans enabled the researchers to monitor brain activity by measuring blood flow in different parts of the brain.
The researchers matched the brain activity with time-coded transcripts of the stories. That way, they could tell which part of the brain responded to each word.
They also used a computer program to allocate the thousands of words in the stories to semantic categories. For example, the words “cat,” “fish,” and “bear” all belong to the category “animal.”
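The word-to-category step can be sketched as a simple dictionary lookup. This is only an illustrative sketch: the category names and word lists below are assumptions for the example, not the study's actual taxonomy or software.

```python
# Hypothetical semantic taxonomy (illustrative only, not the study's):
SEMANTIC_CATEGORIES = {
    "animal": {"cat", "fish", "bear"},
    "place": {"house", "river", "city"},
}

def categorize(word):
    """Return the semantic category of a word, or None if uncategorized."""
    w = word.lower()
    for category, members in SEMANTIC_CATEGORIES.items():
        if w in members:
            return category
    return None

# Label each word of a transcript with its category:
transcript = ["The", "bear", "crossed", "the", "river"]
labels = [categorize(w) for w in transcript]
# → [None, 'animal', None, None, 'place']
```

In the actual study, each labeled word could then be aligned with the fMRI time course at the moment it was heard or read.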
Potential applications of semantic maps
The researchers foresee the study’s findings helping to increase understanding of how the brain processes language.
The semantic maps could also aid the study of healthy people and those with conditions that affect brain function, such as stroke, epilepsy, and injuries that can impair speech.
Deniz suggests that the maps could also give fresh insights into dyslexia, a common neurological condition that impairs the ability to read.