How Brain Waves Surf Sound Waves to Process Speech

When people listen to speech, their ears translate the sound waves into neural signals that are then processed and interpreted by various parts of the brain, starting with the auditory cortex. Years of neurophysiological studies have observed that the waves of neural activity in the auditory cortex lock onto the audio signal’s “envelope”—essentially, the frequency with which the loudness changes. (As Poeppel put it, “The brain waves surf on the sound waves.”) By very faithfully “entraining” on the audio signal in this way, the brain presumably segments the speech into manageable chunks for processing.
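To make “envelope” concrete, here is a minimal sketch in Python, not taken from any of the studies described here, of how the slow loudness contour of a recording can be extracted. The file name and the 10-hertz cutoff are illustrative assumptions.

    # Illustrative only: extract the slow loudness contour of a recording.
    # "speech_sample.wav" is a hypothetical mono file; the 10 Hz cutoff is an
    # assumption meant to keep just the syllable-rate fluctuations.
    import numpy as np
    from scipy.io import wavfile
    from scipy.signal import hilbert, butter, filtfilt

    rate, audio = wavfile.read("speech_sample.wav")
    audio = audio.astype(float)

    envelope = np.abs(hilbert(audio))          # instantaneous amplitude
    b, a = butter(4, 10.0 / (rate / 2.0), btype="low")
    slow_envelope = filtfilt(b, a, envelope)   # the "envelope" that neural activity tracks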

In her experiments, Assaneo had people listen to nonsensical strings of syllables played at rates between 2 and 7 hertz while she measured the activity in their auditory and speech motor cortices. (She used nonsense syllables so that the brain would have no semantic response to the speech, in case that might indirectly affect the motor areas. “When we perceive intelligible speech, the brain network being activated is more complex and extended,” she explained.) If the signals in the auditory cortex drive those in the motor cortex, then they should stay entrained to each other throughout the tests. If the motor cortex signal is independent, it should not change.

But what Assaneo observed was rather more interesting and surprising, Poeppel said: The auditory and speech motor activities did stay entrained, but only up to about 5 hertz. Once the audio changed faster than spoken language typically does, the motor cortex dropped out of sync.
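“Staying entrained” can be quantified with a phase-locking measure: a value near 1 means two rhythms hold a consistent phase relationship, while a value near 0 means they drift apart. The sketch below is a generic illustration, assuming both signals are sampled at the same rate and filtered around the syllable rate; it is not the study’s actual analysis pipeline.

    # Illustrative only: phase-locking value between two equally sampled,
    # band-passed signals (for example, a stimulus envelope and a neural trace).
    import numpy as np
    from scipy.signal import hilbert

    def phase_locking_value(x, y):
        phase_x = np.angle(hilbert(x))         # instantaneous phase of each signal
        phase_y = np.angle(hilbert(y))
        # Average the phase differences on the unit circle: near 1 = locked, near 0 = drifting.
        return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))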

A computational model later confirmed that these results were consistent with the idea that the motor cortex has its own internal oscillator that naturally operates at around 4 to 5 hertz.
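As a rough illustration of that idea, and not the computational model referred to above, a self-sustaining oscillator with a preferred rate near 4 to 5 hertz locks onto a rhythmic input only when the input rate is close to that preferred rate, and it loses the rhythm when driven faster. All parameter values below are assumptions chosen for the demonstration.

    # Illustrative toy, not the study's model: a phase oscillator with a preferred
    # rate near 4.5 Hz, nudged by a rhythmic input. It stays in sync only when the
    # input rate falls inside a band around that preferred rate.
    import numpy as np

    F_NATURAL = 4.5              # oscillator's preferred rate (Hz), an assumption
    COUPLING = 2 * np.pi * 1.2   # pull of the input on the oscillator, an assumption
    DT, DURATION = 0.001, 20.0   # integration step and run length (seconds)

    def stays_in_sync(drive_hz):
        steps = int(DURATION / DT)
        theta, diffs = 0.0, []
        for i in range(steps):
            drive_phase = 2 * np.pi * drive_hz * i * DT
            theta += DT * (2 * np.pi * F_NATURAL + COUPLING * np.sin(drive_phase - theta))
            diffs.append(drive_phase - theta)
        tail = diffs[steps // 2:]                   # ignore the initial transient
        return max(tail) - min(tail) < 2 * np.pi    # locked if the phase gap stops drifting

    locked = [r for r in np.arange(2.0, 7.01, 0.5) if stays_in_sync(r)]
    print("in sync from", min(locked), "to", max(locked), "Hz")

Sweeping the input rate prints a locking band that ends a little above 5 hertz. Because this toy’s band is symmetric around its preferred rate, it also falls out of sync at the slowest rates, which the model described in the study need not do; the point is only that driving an oscillator much faster than its natural rhythm breaks the lock, just as the motor cortex dropped out of sync above about 5 hertz.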

Source: nautil.us