Pittsburgh, 3 February 2022 | 15:00 (Europe/Amsterdam)

Untangling Mixed (Neural) Signals

New Research Published in Current Biology Shows How “Polyglot” Neurons Encode and Decode Sensorimotor “Chatter”

Summary

New research led by the Cognition and Sensorimotor Integration Lab at the University of Pittsburgh Swanson School of Engineering has uncovered how neurons encode and decode information and differentiate between motor and sensory signals.

During sensorimotor processing in the brain, neurons are constantly bombarded with information from other neurons. When we use our eyes to interact with our environment, thousands of neurons communicate with each other to make sense of incoming information and react to it: if someone throws you a ball, your eyes track it, and a chain of neural communication tells your hand where to go to catch it.

But how these neurons communicate between seeing and acting is a complex and important question. New research led by the Cognition and Sensorimotor Integration Lab at the University of Pittsburgh Swanson School of Engineering has uncovered how neurons encode and decode that information and differentiate between motor and sensory signals.

“We wanted to figure out how a decoder knows exactly when to initiate a movement if it is also getting signals when a movement isn’t desired,” said Uday K. Jagadisan, lead author and former graduate student in the Cognition and Sensorimotor Integration Lab. “We were not only able to uncover a reliable temporal pattern in the neuron activity that was tied to movement, but we were also able to replicate it with microstimulation.”

The researchers studied how decoding happens when the signals lead to movement, trying to differentiate it from how information is encoded during visual processing. In other words, if the neurons are receiving both sensory and motor signals, how do they tell them apart? How does the brain know when to make the body move?

“The same groups of neurons can communicate information about sensations and movement, and the brain knows which signal is which. We found it’s as if groups of neurons encode the same information in one ‘language’ to send messages about sensation and in another ‘language’ to send information about movement,” explained Neeraj Gandhi, professor of bioengineering who leads the Cognition and Sensorimotor Integration Lab at Pitt. “The receiving groups of neurons only act on one of the languages—that’s the key.” 
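
To make the “two languages” idea concrete, here is a minimal toy sketch in Python. It is not the authors’ code or data, and the population size, rates, and thresholds are hypothetical. It illustrates how two spike patterns can carry the same average firing rate (the “rate code”) while differing in temporal structure, and how a readout sensitive to coincident spikes across the population responds to one pattern but not the other.

```python
# Illustrative sketch only; not the authors' code or data. Two toy spike
# patterns share the same mean rate but differ in temporal structure, and a
# coincidence-sensitive readout "hears" only one of the two languages.
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_bins = 20, 100  # hypothetical population size and 1-ms time bins
rate = 0.2                   # same mean spike probability per bin in both cases

# "Sensory" pattern: each neuron fires independently at the target rate.
sensory = rng.random((n_neurons, n_bins)) < rate

# "Motor" pattern: roughly the same mean rate, but spikes are packed into
# shared, precisely timed events across the population (temporal structure).
motor = np.zeros((n_neurons, n_bins), dtype=bool)
event_times = rng.choice(n_bins, size=int(rate * n_bins), replace=False)
motor[:, event_times] = rng.random((n_neurons, len(event_times))) < 0.9

def rate_readout(spikes):
    """A decoder that only sees average rates cannot tell the patterns apart."""
    return spikes.mean()

def coincidence_readout(spikes, threshold=10):
    """A decoder that responds when many neurons spike in the same time bin
    reacts to the temporally structured pattern only."""
    return int((spikes.sum(axis=0) >= threshold).sum())

print("mean rate   sensory:", round(rate_readout(sensory), 3),
      "  motor:", round(rate_readout(motor), 3))      # nearly identical
print("coincidences sensory:", coincidence_readout(sensory),
      "  motor:", coincidence_readout(motor))         # motor far exceeds sensory
```

In this toy version, a rate-based readout sees the two patterns as essentially the same, while the coincidence-based readout is driven almost exclusively by the temporally structured “motor” pattern, which is one simple way a downstream population could act on only one of the two languages.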

The research is the first both to pinpoint the encoding and decoding process and to verify the findings using microstimulation. The researchers reproduced the pattern of neural activity in non-human primate brains and elicited the intended motor reaction.

This discovery is vital for applications like brain-computer interfaces and neuroprosthetics. These artificial systems can assist people who have suffered brain injuries or other disorders that affect motor or sensory processes, but in order to work reliably, they need to decode brain activity and understand the intentions behind the patterns of activity.

“For neuroprosthetics, this research could create a way to put the brakes on and inhibit response when you don’t need it, and release when actually needed, all based on neuron chatter,” said Jagadisan. “Current technology is just delivering a pulse every few milliseconds. If you have the ability to control the time when each pulse is delivered, you can select the patterned microstimulation to achieve the effect that you want.” 
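
As a rough illustration of that contrast, the snippet below compares a conventional fixed-interval pulse train with a patterned train whose pulse times follow a hypothetical movement-related template. This is an assumption-laden sketch, not a real stimulator interface; the durations, intervals, and template times are invented for illustration, and both trains deliver the same number of pulses over the same window.

```python
# Illustrative sketch only; not a real stimulator API. It contrasts fixed-rate
# stimulation (a pulse every few milliseconds) with patterned microstimulation,
# where each pulse time is chosen to match a desired temporal template.
import numpy as np

duration_ms = 50

# Conventional stimulation: one pulse every 5 ms, regardless of context.
fixed_rate_pulses = np.arange(0, duration_ms, 5)

# Patterned stimulation: same number of pulses over the same window, but their
# timing follows a movement-related template (hypothetical values).
template_ms = np.array([2, 4, 5, 11, 12, 19, 27, 28, 36, 44])

print("fixed-rate pulse times (ms):", fixed_rate_pulses.tolist())
print("patterned pulse times (ms): ", template_ms.tolist())
```

The average pulse rate is identical in both trains; only the timing of individual pulses differs, which is the degree of freedom the quote above suggests future stimulation technology could exploit.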

The paper, “Population temporal structure supplements the rate code during sensorimotor transformations” (DOI: 10.1016/j.cub.2022.01.015), was published in the journal Current Biology and authored by Jagadisan and Gandhi. Jagadisan is now a postdoctoral researcher at UC Berkeley, studying how cortical communication networks generate visual perception.