Patient Daily | Mar 11, 2026

Learning found to boost coordination among sensory neurons during skill development

A recent study from the University of Rochester and its Del Monte Institute for Neuroscience has found that as people improve at certain skills, such as recognizing faces or identifying patterns, sensory neurons in the brain become more coordinated. This challenges a longstanding belief in neuroscience that learning increases efficiency by making neurons act more independently.

The research team, led by graduate student Shizhao Liu and faculty members Ralf Haefner and Adam Snyder from the Department of Brain and Cognitive Sciences, published their findings in Science. Their work suggests that instead of minimizing redundancy across neural signals, learning actually increases shared activity among neurons.

"The dominant view in neuroscience has been that learning makes the brain more efficient by pushing neurons to act more independently, so information can be read out more cleanly," Liu said. "Our results support a different idea: that sensory areas of the brain aren't just passively encoding the world. They're actively performing inference by combining what's coming in with what the brain has learned to expect."

For many years, researchers thought that reducing shared activity among neurons allowed for more efficient processing of information. However, this new study indicates that as learning occurs, neurons share more information and coordinate their actions—especially when individuals are engaged in tasks requiring decisions.

This increased coordination appears to be linked to internal expectations shaped by higher-level brain areas. As people learn, feedback from these areas influences how sensory neurons respond, allowing perception to integrate both new input and prior experience.

The team monitored small networks of neurons in the visual cortex over several weeks while subjects learned to distinguish between different visual patterns. They observed that before learning took place, neurons mostly operated independently. As subjects improved their skills, however, neuron activity became increasingly coordinated—particularly during moments when decisions were made based on visual input.
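The article does not spell out how the team quantified coordination; a common measure in this kind of experiment is the average pairwise correlation of trial-by-trial spike counts across a recorded population. The sketch below is an illustrative assumption, not the study's actual analysis pipeline: it shows how adding a shared trial-by-trial signal to otherwise independent neurons raises that measure.

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_pairwise_correlation(spike_counts):
    """Average Pearson correlation across all neuron pairs.

    spike_counts: array of shape (n_trials, n_neurons) holding
    per-trial spike counts for a small recorded population.
    """
    corr = np.corrcoef(spike_counts, rowvar=False)  # n_neurons x n_neurons
    upper = corr[np.triu_indices_from(corr, k=1)]   # unique pairs only
    return upper.mean()

# Simulated "before learning": independent Poisson responses.
n_trials, n_neurons = 200, 10
before = rng.poisson(lam=5.0, size=(n_trials, n_neurons)).astype(float)

# Simulated "after learning": a shared trial-by-trial fluctuation
# (standing in for top-down feedback) rides on top of the same
# independent noise, so every neuron inherits a common component.
shared = rng.normal(0.0, 2.0, size=(n_trials, 1))
after = before + shared

print(f"before learning: {mean_pairwise_correlation(before):.2f}")
print(f"after learning:  {mean_pairwise_correlation(after):.2f}")
```

With a population of ten neurons the correlation matrix has 45 unique pairs; averaging only the upper triangle avoids counting each pair twice and the trivial self-correlations on the diagonal.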

Notably, this effect was only present when subjects were actively involved in a task; it disappeared when they passively viewed images without needing to respond. The most significant increase in coordination occurred among neurons most relevant to the task at hand.

These changes are not permanent but appear flexible and guided by feedback from higher-level regions of the brain. This flexibility allows neuronal behavior to adjust depending on current demands.

The findings contribute to a growing perspective in neuroscience: rather than simply relaying information forward like a conveyor belt, the brain combines incoming data with expectations formed from past experiences. This process requires groups of neurons to work together rather than separately.

Understanding how coordination among neurons develops during learning could shed light on learning disorders and perceptual conditions. It may also inspire advances in artificial intelligence by mimicking how the human brain blends prior knowledge with new sensory input for greater adaptability.

"Most current artificial intelligence systems are built on discriminative architectures that map sensory inputs directly to outputs," Haefner said. "Our new research suggests that incorporating generative feedback loops—in which internal models shape sensory representations—may lead to systems that learn faster from limited data, are more robust to uncertainty, and adapt more flexibly to changing tasks."
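Haefner's contrast can be sketched with a toy predictive-coding loop; this is an assumption for illustration, not the architecture his group uses. Rather than mapping input straight to output, an internal generative model (here, a hypothetical linear one, x ≈ W·z) feeds predictions back and updates its latent beliefs z from the residual error.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical toy model: the higher area assumes sensory input is
# generated as x = W @ z + noise. Inference is a feedback loop:
# predict the input top-down, compare with what actually arrived,
# and nudge the latent beliefs z to shrink the prediction error.
n_input, n_latent = 8, 3
W = rng.normal(size=(n_input, n_latent))  # assumed generative weights

def infer_latent(x, steps=400, lr=0.04):
    z = np.zeros(n_latent)
    for _ in range(steps):
        prediction = W @ z        # top-down feedback: expected input
        error = x - prediction    # bottom-up signal: what feedback missed
        z += lr * (W.T @ error)   # gradient step on the squared error
    return z

# A purely discriminative system would instead learn one fixed
# input-to-output map, with no predictions flowing back down.
z_true = rng.normal(size=n_latent)
x = W @ z_true + 0.05 * rng.normal(size=n_input)

z_hat = infer_latent(x)
print("true latent:", np.round(z_true, 2))
print("inferred:   ", np.round(z_hat, 2))
```

The loop converges to the latent vector whose top-down prediction best explains the input, which is one concrete sense in which "internal models shape sensory representations."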
