Remember the voice of Charlie Brown's teacher: "Wah wah wah wah ..."? Sometimes we can hear someone speaking, but that doesn't always mean we're tuning in. Now neuroscientists have identified a way of detecting brain signals that indicate when a person is actually understanding speech.
The technique, which applies machine learning, represents an advance in using relatively inexpensive EEG (electroencephalography) technology (think skullcap outfitted with wired electrodes) to monitor electrical brain activity and assess comprehension. The approach could offer a sensitive measure of language development in infants and comprehension among patients, including those in a reduced state of consciousness.
"Speech is really amazing, but we're so used to it and our brains are so good at understanding a steady stream of words that we often take it for granted," says Edmund Lalor, associate professor of biomedical engineering and neuroscience at the University of Rochester and Trinity College Dublin. Lalor, who oversaw the study published March 2018 in the journal Current Biology, points out that words roll off the tongues of most speakers at an impressive clip — about 120 to 200 words per minute. Our brains, when alert, have little problem keeping up and interpreting the sounds as syllables, words, paragraphs and meaning.
To detect how much processing is happening as we interpret speech, Lalor and colleagues, including graduate student and lead author Michael Broderick, first applied machine learning to audiobook recordings and addresses by former President Barack Obama to estimate when key moments of comprehension should occur. "The machine learning ended up producing a big long vector of numbers for every word," Lalor says. "A word with high number value carries big meaning and should evoke a stronger EEG response." The machine learning's readout aligned with EEG readings from the brains of people listening to the same recordings, he says. Spikes in the brain's electrical activity corresponded to key moments of understanding.
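For readers who want a concrete sense of the idea, the sketch below is a toy illustration, not the team's actual pipeline: it assigns each word a crude "meaning" value, here taken to be the dissimilarity between a word's embedding vector and the average of the words preceding it, and then checks whether those values track per-word EEG response amplitudes. The word list, embeddings, EEG values and the `semantic_dissimilarity` helper are all made-up stand-ins assumed for this sketch, not anything published by the researchers.

```python
# A toy illustration (not the authors' code) of the idea described above:
# give every word a numeric "meaning" value from word embeddings, then ask
# whether those values line up with the EEG response to each word.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in word embeddings; in the real study the per-word numbers came from
# machine learning applied to large amounts of text.
story = ["the", "nation", "faces", "a", "moment", "of", "great", "challenge"]
embeddings = {w: rng.normal(size=50) for w in story}

def semantic_dissimilarity(words, vectors):
    """1 minus the cosine similarity between each word and the average of the
    words preceding it. Higher values = the word adds more new meaning in
    context (an assumption of this sketch, not a published definition)."""
    values = [0.0]  # the first word has no preceding context
    for i in range(1, len(words)):
        context = np.mean([vectors[w] for w in words[:i]], axis=0)
        v = vectors[words[i]]
        cos = v @ context / (np.linalg.norm(v) * np.linalg.norm(context))
        values.append(1.0 - cos)
    return np.array(values)

meaning = semantic_dissimilarity(story, embeddings)

# Pretend EEG: one response amplitude per word, time-locked to the word's onset.
# In practice these values would be measured from scalp electrodes.
eeg_amplitude = 0.8 * meaning + rng.normal(scale=0.1, size=len(story))

# If the listener is understanding, per-word EEG responses should scale with
# the meaning values, giving a clearly positive correlation.
r = np.corrcoef(meaning, eeg_amplitude)[0, 1]
print(f"correlation between word meaning and EEG response: {r:.2f}")
```

If comprehension is intact, the correlation printed at the end should be clearly positive; under the degraded listening conditions described next, the researchers found that this kind of coupling weakens or disappears.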
To further test that the spikes in electrical signals corresponded to what people were hearing, the team took EEG readings in other situations where listening was compromised. In one, background noise made hearing the speaker difficult, and the listener's brain signals showed a weaker response (comprehension was improved when the listener was also able to see a video of the speaker). In another experiment, the listener's attention was muddled by the competing sound of another narrator telling a different story at the same time. "This simulates a real-world environment where you're in a noisy room and you have to focus your attention on one speaker and ignore the voices of everyone else around you," explains Broderick. In that noisy bar-like scenario, the subject's EEG readings also showed decreased comprehension.
Finally, the team played audiobook recordings backward. In those tests, Broderick says, the brain's response "disappeared" because the sounds clearly made no sense to the listeners.
The practice of detecting comprehension using EEGs isn't new. Since the early 1980s, researchers have pinpointed EEG spikes when people hear unexpected words in a series. The so-called N400 response is typically seen about 400 milliseconds after the incongruent word is heard. The problem, says Lalor, is in interpreting the meaning of the N400 signal.
"You give people a sentence like 'the dentist told me to brush my tree,' and it has this weird violation in it and so your brain responds. People are still debating what that response might mean," Lalor says. "What's different about ours is it's built on an assessment of the amount of meaning of each word in context. So it may allow us to better understand different aspects of linguistic processing."
While Lalor admits his team's model could be refined, he suggests that the EEG tests, which are much cheaper and more accessible than MRIs, could find several useful applications. Parents concerned about their child's development could request such tests for children even as young as 18 months, when early language comprehension begins. At that stage, EEG readings should signal some basic language comprehension, Lalor says. If they don't, it could be an early sign of a language problem, one that might not otherwise be diagnosed until a later age.
The testing also could offer an inexpensive way of evaluating patients in an apparent vegetative state. Lalor says it could be as simple as attaching electrodes to a patient, having them listen to an audiobook for an hour and then running analysis. Damian Cruse, a psychologist at the U.K.'s University of Birmingham who has analyzed brain function among people with compromised consciousness, called the team's method "very promising," adding that approaches like these could "provide families and caregivers with vital information."
Looking to the future, Lalor even envisions developing wearable EEG tests that could take an instant read on whether, say, a soldier on the battlefield or a pilot in busy airspace is not only hearing instructions but, unlike the students in Charlie Brown's classroom, also registering them.
"If the signals are there," he says, "then you know that they're understanding."