Study reveals how two neural coding systems in the mammalian brain cooperate to capture sounds and process information.
When people hear the sound of footsteps or the drilling of a woodpecker, the rhythmic structure of the sounds is striking, says Michael Wehr, a professor of psychology at the University of Oregon.
Even when the temporal structure of a sound is less obvious, as with human speech, the timing still conveys a variety of important information, he says. When a sound is heard, neurons in the lower subcortical region of the brain fire in sync with the rhythmic structure of the sound, almost exactly encoding its original structure in the timing of spikes.
A new study by a team of scientists at the University of Oregon, detailed in the April 8 issue of the journal Neuron, documents this transformation of information in the auditory system of rats. The findings are similar to those previously shown in primates, suggesting that the processes are a general feature in the auditory systems of all mammals, the team concluded.
Neurons in the brain use two different “languages” to encode information: temporal coding and rate coding. For neurons in the auditory thalamus, the part of the brain that relays information from the ears to the auditory cortex, this takes the form of temporal coding. The neurons fire in sync with the original sound, providing an exact replication of the sound’s structure in time. In the auditory cortex, however, about half the neurons use rate coding, which instead conveys the structure of the sound through the density and rate of the neurons’ spiking, rather than the exact timing.
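The difference between the two codes can be seen in a toy simulation. The sketch below is illustrative only: the neurons, jitter, and gain values are made up for demonstration and are not taken from the study. A synchronized neuron fires one spike locked to each click, while a non-synchronized neuron fires more spikes as the click rate rises, but at times unrelated to the clicks.

```python
# Toy contrast between temporal coding and rate coding for a click train.
# All parameters are illustrative; none come from the study itself.
import random

random.seed(0)

def click_times(rate_hz, duration_s=1.0):
    """Times of a regular click train."""
    period = 1.0 / rate_hz
    return [i * period for i in range(int(duration_s * rate_hz))]

def temporal_code(clicks, jitter_s=0.001):
    """A synchronized neuron: one spike locked to each click."""
    return [t + random.gauss(0, jitter_s) for t in clicks]

def rate_code(clicks, gain=3.0, duration_s=1.0):
    """A non-synchronized neuron: spike count grows with click rate,
    but the spike times are unrelated to the click times."""
    n_spikes = int(gain * len(clicks))
    return sorted(random.uniform(0, duration_s) for _ in range(n_spikes))

clicks = click_times(10)             # 10 clicks per second
sync_spikes = temporal_code(clicks)
nonsync_spikes = rate_code(clicks)

# The temporal code preserves timing: each spike sits near a click.
max_offset = max(abs(s - c) for s, c in zip(sync_spikes, clicks))
print(f"temporal code: max offset from a click = {max_offset * 1000:.1f} ms")
# The rate code preserves only density: more clicks mean more spikes.
print(f"rate code: {len(nonsync_spikes)} spikes for {len(clicks)} clicks")
```

Reading out the temporal code requires looking at when spikes occur; reading out the rate code requires only counting them.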
But how does the transformation from one coding system to another take place?
[Figure: example of temporal coding (synchronized) and rate coding (non-synchronized) neurons.]
To find the answer, Wehr and an undergraduate student—lead author Xiang Gao, now a medical student at the Oregon Health & Science University—used a technique known as whole-cell recording to capture the thousands of interactions that take place within a single neuron each time it responds to a sound. The team observed how individual cells responded to a steady stream of rhythmic clicks.
During the study, Wehr and Gao noted that individual rate-coding neurons received up to 82% of their inputs from temporal-coding neurons. “This means that these neurons are acting like translators, converting a sound from one language to another,” Wehr said. “By peering inside a neuron, we can see the mechanism for how the translation is taking place.”
One of these mechanisms is the way that so-called excitatory and inhibitory neurons cooperate to “push and pull” together, like parents pushing a playground teeter-totter. In response to each click, the excitatory neurons first “push” on a cell, and then inhibitory neurons follow with a “pull” exactly out of phase with the excitatory neurons. Together, this combination of pushing and pulling drives cells to fire spikes at a high rate, converting the temporal code into a rate code.
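A minimal sketch of this push-pull idea can be built from a generic leaky integrate-and-fire neuron. Everything here is a simplified assumption, not the study's model: excitation drives the cell during the first half of each click cycle, inhibition pulls it down during the second half, and the alternating drive keeps the cell spiking.

```python
# Leaky integrate-and-fire sketch of "push-pull" drive: excitation on
# each click, inhibition half a cycle later, exactly out of phase.
# All parameters are invented for illustration, not fitted to data.

DT = 0.001          # 1 ms time step (s)
TAU = 0.02          # membrane time constant (s)
THRESHOLD = 1.0     # spike threshold (arbitrary units)
CLICK_RATE = 50.0   # clicks per second

def simulate(duration_s=1.0, push=4.0, pull=1.5):
    """Count spikes produced by alternating excitatory/inhibitory drive."""
    period = 1.0 / CLICK_RATE
    v, spikes, t = 0.0, 0, 0.0
    while t < duration_s:
        phase = (t % period) / period
        drive = push if phase < 0.5 else -pull   # "push", then "pull"
        v += DT * (drive - v) / TAU              # leaky integration
        if v >= THRESHOLD:
            spikes += 1
            v = 0.0                              # reset after a spike
        t += DT
    return spikes

print("spikes per second under push-pull drive:", simulate())
```

With these made-up parameters the cell fires on essentially every click cycle, illustrating how paired excitation and inhibition can sustain a high firing rate.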
The observation provides a glimpse into how circuits deep within the brain give rise to our perception of the world. Prior to the study, neuroscientists had speculated that the transformation from temporal coding to rate coding might explain the perceptual boundary we experience between rhythm and pitch. Slow trains of clicks sound rhythmic, but fast trains of clicks sound like a buzzy tone. It could be that these two very different experiences of sound are produced by the two different kinds of neurons.
In the study from the University of Oregon, synchronized neurons, using temporal coding, were able to track a click train up to a speed of about 20 clicks per second. At this point, the non-synchronized, rate-coding neurons began to take over. These non-synchronized neurons could respond to much faster speeds, up to about 500 clicks per second, but with a rate code in which the neurons fired in a random, disconnected pattern.
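Whether a neuron counts as synchronized is conventionally quantified with vector strength, a standard measure of phase locking (1 means spikes perfectly locked to the click cycle, values near 0 mean the timing is unrelated). The sketch below applies it to two simulated neurons; the spike trains are hypothetical, not data from the study.

```python
# Vector strength: how tightly spikes lock to a periodic stimulus.
# The two spike trains below are simulated for illustration only.
import math
import random

random.seed(1)

def vector_strength(spike_times, rate_hz):
    """Mean resultant length of spike phases relative to the click cycle."""
    phases = [2 * math.pi * ((t * rate_hz) % 1.0) for t in spike_times]
    c = sum(math.cos(p) for p in phases) / len(phases)
    s = sum(math.sin(p) for p in phases) / len(phases)
    return math.hypot(c, s)

rate_hz = 10.0
# Synchronized neuron: one slightly jittered spike per click.
locked = [i / rate_hz + random.gauss(0, 0.002) for i in range(10)]
# Non-synchronized neuron: many spikes at random, disconnected times.
unlocked = [random.uniform(0, 1.0) for _ in range(200)]

print("locked neuron, vector strength  :", round(vector_strength(locked, rate_hz), 2))
print("unlocked neuron, vector strength:", round(vector_strength(unlocked, rate_hz), 2))
```

The synchronized train scores near 1 while the random train scores near 0, which is the kind of criterion used to sort neurons into the two classes.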
Why would the auditory system switch to a less faithful representation? A clue may lie in the visual cortex, which also uses rate coding.
“This transformation in the auditory system is similar to what has been observed in the visual system,” Wehr said. “Except that in the auditory system, neurons are encoding information about time instead of about space.”
Neuroscientists believe rate codes could support multi-sensory integration in the higher cortical areas, Wehr said. A rate code could be a “universal language,” helping us make decisions by putting together what we see and what we hear.
The National Institutes of Health funded the study (1R01-DC-011379).