Irish neuroscientists have identified a specific brain signal that indicates whether speech has been understood.
Their discovery could have important potential applications, such as assessing brain function in unresponsive patients, tracking language development in infants and even detecting the early onset of dementia.
Working with a team from the University of Rochester in the US, the neuroscientists from Trinity College Dublin (TCD) managed to identify the specific signal associated with the conversion of speech into understanding.
They found that this signal is present when a person has understood what they have heard, but is missing when they either did not understand, or were not paying attention.
According to the TCD team, people routinely speak at a rate of 120-200 words per minute during everyday interactions. In order for listeners to understand speech at these rates and not lose track of a conversation, their brains must understand the meaning of these words very quickly; at 200 words per minute, that leaves roughly 300 milliseconds per word.
This is not an easy task given that the meaning of words can vary depending on the context. For example, compare 'I saw a bat flying last night' with 'I hit the ball with a baseball bat'.
Until now, the way in which our brains worked out the meaning of words in context was unclear. However, these latest findings show that our brains perform a rapid computation of the similarity in meaning that words have in relation to the words that come immediately before them.
The research involved recording electrical brainwave signals from the human scalp (EEG) while people listened to audiobooks. By analysing the brain activity of the participants, the scientists were able to identify a specific brain response that reflected how similar or different a given word was from the words that came before it in a story.
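The core quantity here is how similar each word is, in meaning, to the words that came before it. The sketch below shows one plausible way to compute such a semantic dissimilarity score, as one minus the cosine similarity between a word's vector and the average of its context vectors. The three-dimensional toy vectors are purely illustrative assumptions; an actual analysis would use embeddings from a trained word-embedding model, and the study's exact computation is not specified here.

```python
import numpy as np

def semantic_dissimilarity(word_vec, context_vecs):
    """Return 1 - cosine similarity between a word's vector and the
    average vector of the words preceding it. Higher values mean the
    word is less similar in meaning to its context."""
    context = np.mean(context_vecs, axis=0)
    cosine = np.dot(word_vec, context) / (
        np.linalg.norm(word_vec) * np.linalg.norm(context)
    )
    return 1.0 - cosine

# Toy 3-dimensional "embeddings" (illustrative only, not from a real model).
vecs = {
    "baseball": np.array([0.9, 0.1, 0.0]),
    "bat":      np.array([0.8, 0.2, 0.1]),
    "flying":   np.array([0.1, 0.9, 0.2]),
}

# "bat" following "baseball" is semantically close: low dissimilarity.
low = semantic_dissimilarity(vecs["bat"], [vecs["baseball"]])
# "bat" following an unrelated context word scores higher.
high = semantic_dissimilarity(vecs["bat"], [vecs["flying"]])
print(low, high)
```

In the study, a word-by-word score of this kind was related to the EEG recordings; the reported brain response tracked how similar or different each word was from the story so far.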
The scientists found that this signal disappeared completely when the participants could not understand the speech because it was too noisy, or if they simply were not paying attention to it.
This signal therefore represents a very sensitive measure of whether a person has actually understood what they heard.
"Potential applications include testing language development in infants, or determining the level of brain function in patients in a reduced state of consciousness.
"The presence or absence of the signal may also confirm if a person in a job that demands precision and speedy reactions, such as an air traffic controller or soldier, has understood the instructions they have received, and it may perhaps even be useful for testing for the onset of dementia in older people based on their ability to follow a conversation," explained lead researcher Ed Lalor, Ussher Assistant Professor at TCD.
Details of these findings are published in the journal Current Biology.