Random compressed coding with neurons

Simone Blanco Malerba, Mirko Pieropan, Yoram Burak, Rava Azeredo da Silveira

Research output: Contribution to journal › Article › peer-review

Abstract

Classical models of efficient coding in neurons assume simple mean responses—"tuning curves"—such as bell-shaped or monotonic functions of a stimulus feature. Real neurons, however, can be more complex: grid cells, for example, exhibit periodic responses that impart the neural population code with high accuracy. But do highly accurate codes require fine-tuning of the response properties? We address this question using a simple model: a population of neurons with random, spatially extended, and irregular tuning curves. Irregularity enhances the local resolution of the code but gives rise to catastrophic, global errors. For optimal smoothness of the tuning curves, when local and global errors balance out, the neural population compresses information about a continuous stimulus into a low-dimensional representation, and the resulting distributed code achieves exponential accuracy. An analysis of recordings from monkey motor cortex points to such "compressed efficient coding." Efficient codes do not require a finely tuned design—they emerge robustly from irregularity or randomness.
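The abstract's central idea—random, spatially extended tuning curves whose smoothness trades local resolution against global errors—can be illustrated with a minimal sketch. This is not the paper's actual model or analysis; it assumes a squared-exponential Gaussian-process kernel for the tuning curves (one common choice, and the "Gaussian process" keyword suggests a related construction), i.i.d. Gaussian response noise, and a simple nearest-template maximum-likelihood decoder. All function names and parameter values are illustrative.

```python
import numpy as np

def sample_tuning_curves(n_neurons, stimuli, length_scale, rng):
    """Draw random tuning curves from a Gaussian process with a
    squared-exponential kernel; length_scale controls smoothness
    (small length_scale -> irregular curves)."""
    d = stimuli[:, None] - stimuli[None, :]
    K = np.exp(-0.5 * (d / length_scale) ** 2)
    K += 1e-6 * np.eye(len(stimuli))          # jitter for numerical stability
    L = np.linalg.cholesky(K)
    return rng.standard_normal((n_neurons, len(stimuli))) @ L.T

def decode_errors(curves, stimuli, noise_sd, n_trials, rng):
    """Maximum-likelihood decoding under i.i.d. Gaussian noise: pick the
    stimulus whose population template is closest to the noisy response.
    Returns the absolute decoding error on each trial; small errors are
    'local', while large ones correspond to catastrophic global mistakes."""
    n_neurons, n_stim = curves.shape
    true_idx = rng.integers(n_stim, size=n_trials)
    obs = curves[:, true_idx] + noise_sd * rng.standard_normal((n_neurons, n_trials))
    # squared distance of every observation to every stimulus template
    d2 = ((obs[:, None, :] - curves[:, :, None]) ** 2).sum(axis=0)
    est_idx = d2.argmin(axis=0)
    return np.abs(stimuli[est_idx] - stimuli[true_idx])

rng = np.random.default_rng(0)
stimuli = np.linspace(0.0, 1.0, 200)
curves = sample_tuning_curves(n_neurons=20, stimuli=stimuli,
                              length_scale=0.1, rng=rng)
errs = decode_errors(curves, stimuli, noise_sd=0.5, n_trials=500, rng=rng)
print("mean error:", errs.mean(), "median error:", np.median(errs))
```

Sweeping `length_scale` in a sketch like this exhibits the tradeoff the abstract describes: very smooth curves decode with few gross mistakes but coarse resolution, while very irregular curves sharpen typical errors at the cost of occasional large, global ones.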

Original language: English
Article number: 115412
Journal: Cell Reports
Volume: 44
Issue number: 3
DOIs
State: Published - 25 Mar 2025

Keywords

  • CP: Neuroscience
  • Gaussian process
  • efficient coding
  • neural coding
  • receptive field

All Science Journal Classification (ASJC) codes

  • General Biochemistry, Genetics and Molecular Biology
