Understanding the spoken word is probably one of the most complex tasks the human brain can perform. In regular conversation, the average rate of speech is about 120 to 200 words per minute.
Not only must the speaker know the language well enough to choose the right words to convey their thoughts, but the listener must also understand each word at a rapid pace, including words that sound the same but carry different meanings depending on context. New research from the University of Rochester has identified the mechanism in the brain that indicates when a person has understood what is being said. Knowing how this works could pave the way for healthcare professionals to assess patients who might have an injury or condition that impacts cognition.
Edmund Lalor is an associate professor of biomedical engineering and neuroscience at the University of Rochester and Trinity College Dublin. He explained in a press release, "That we can do this so easily is an amazing feat of the human brain – especially given that the meaning of words can vary greatly depending on the context. For example, 'I saw a bat flying overhead last night' versus 'the baseball player hit a home run with his favorite bat.'"
Research into this brain function used a standard method of reading brain wave patterns: electroencephalography (EEG). Using electrodes placed on a patient's scalp, an EEG records electrical signals from the brain, and this is where the team discovered how to interpret the readings to reveal language comprehension.
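For readers curious what handling such recordings looks like in practice, here is a minimal sketch using the open-source MNE-Python library. The file name and filter band are illustrative assumptions, not details from the study.

```python
# Minimal sketch of loading and cleaning scalp EEG with MNE-Python.
# "session.edf" and the 1-30 Hz band are hypothetical choices, not study details.
import mne

raw = mne.io.read_raw_edf("session.edf", preload=True)  # hypothetical recording
raw.filter(l_freq=1.0, h_freq=30.0)  # keep frequencies typically analyzed in speech EEG
data = raw.get_data()                # channels x samples array of scalp voltages
print(f"{data.shape[0]} electrodes, {data.shape[1]} samples")
```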
The research combined machine learning with EEG recordings. The text of audiobooks was fed into a computer that was "trained" to recognize patterns in how words are used. Given the thousands of words contained in just one book, the machine eventually learns which words tend to go together. Each word is essentially converted into a set of numbers, and the software can estimate the meaning of words from how those numbers are assigned, since words used in similar contexts end up with similar numbers. While participants in the study listened to sections of the audiobooks, EEG recordings were made. The EEG data were then correlated with these numerical measures to show precisely where the brain indicated understanding.
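The article only summarizes the pipeline, so the following is a sketch of the general idea rather than the team's actual code: gensim's Word2Vec stands in for whatever embedding model they used, a tiny two-sentence corpus stands in for the audiobook text, and random numbers stand in for real per-word EEG responses.

```python
# Sketch: turn words into numbers, score each word against its context,
# then correlate that score with per-word EEG responses.
# Word2Vec, the toy corpus, and the random EEG values are all assumptions.
import numpy as np
from gensim.models import Word2Vec

# Toy corpus standing in for the audiobook text (hypothetical).
sentences = [
    ["the", "old", "man", "fished", "alone", "in", "the", "gulf"],
    ["the", "old", "man", "had", "gone", "many", "days", "without", "a", "fish"],
]
model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, seed=1)

def dissimilarity(word, context):
    """1 minus cosine similarity between a word's vector and its context mean."""
    w = model.wv[word]
    c = np.mean([model.wv[x] for x in context], axis=0)
    return 1.0 - np.dot(w, c) / (np.linalg.norm(w) * np.linalg.norm(c))

# Score each word in one sentence against the words that preceded it.
sent = sentences[0]
scores = [dissimilarity(w, sent[:i]) for i, w in enumerate(sent) if i > 0]

# Correlate the scores with per-word EEG responses (random placeholders here;
# in the study these would be EEG recordings time-locked to each word).
eeg = np.random.default_rng(0).standard_normal(len(scores))
r = np.corrcoef(scores, eeg)[0, 1]
print(f"correlation between meaning measure and EEG: {r:.2f}")
```

With real data, a consistent correlation between the numerical meaning measure and the EEG response is what would mark the moments of comprehension described above.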
But how was it possible to verify that this signal meant that comprehension was taking place? In one experiment, the team used Hemingway's classic "The Old Man and the Sea." Participants listened to portions of the book, and the investigators noted the signals in the EEG readings that they believed showed understanding. Then the researchers changed it up. Lalor stated, "We could see brain signals telling us that people could understand what they were hearing. When we had the same people come back and hear the same audiobook played backward, the signal disappears entirely."
To further test the theory, they asked participants to listen to a recording of a speech made by President Barack Obama, with added background noise that made it nearly impossible to catch more than a few words here and there. Some signals were detected that showed partial understanding, but they were weak. When the participants were able to view a video of the speech, the researchers noted that the signal "intensified dramatically," because visual cues in the video allowed the study volunteers to make out what they could not hear clearly.
Their study is published in the journal Current Biology. The project is continuing to look into how the brain processes the meaning of words, sounds, and other stimuli. The team hopes the findings might be useful for assessing brain function in patients who are comatose or otherwise impaired, for testing language development in children, and possibly for detecting declining cognition in older patients during daily conversation.
Sources: University of Rochester; Current Biology; Trinity College Dublin