Repeated sequential learning increases memory capacity via effective decorrelation in a recurrent neural network

Tomoki Kurikawa, Omri Barak, Kunihiko Kaneko

Research output: Contribution to journal › Article › peer-review

Abstract

Memories in neural systems are shaped through the interplay of neural and learning dynamics under external inputs. This interplay can result in either overwriting or strengthening of memories as the system is repeatedly exposed to multiple input-output mappings, but it is unclear which effect dominates. By introducing a simple local learning rule to a neural network, we found that the memory capacity is drastically increased by sequentially repeating the learning steps of input-output mappings. We show that the resulting connectivity decorrelates the target patterns. This process is associated with the emergence of spontaneous activity that intermittently exhibits neural patterns corresponding to embedded memories. Stabilization of memories is achieved by a distinct bifurcation from the spontaneous activity under the application of each input.
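The abstract describes sequentially and repeatedly training a recurrent network on input-output mappings with a simple local learning rule. A minimal sketch of that setup is below, assuming a rate network with tanh units and a Hebbian-like local update (presynaptic activity times postsynaptic error); the network size, learning rate, pattern statistics, and update rule here are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100      # number of neurons (illustrative)
n_maps = 3   # number of input-output mappings to embed
eps = 0.03   # learning rate (illustrative)
dt = 0.1     # integration step

# Random input patterns and random binary target patterns (assumed forms).
inputs = rng.standard_normal((n_maps, N)) / np.sqrt(N)
targets = np.sign(rng.standard_normal((n_maps, N)))

# Random initial recurrent connectivity, no self-connections.
J = rng.standard_normal((N, N)) / np.sqrt(N)
np.fill_diagonal(J, 0.0)

def learn_mapping(J, eta, xi, steps=200):
    """Relax the rate dynamics under input eta while a local rule
    pulls the activity pattern toward the target xi."""
    x = np.tanh(0.1 * rng.standard_normal(N))
    for _ in range(steps):
        x = x + dt * (-x + np.tanh(J @ x + eta))
        # Local update: postsynaptic error (xi - x) times presynaptic x.
        J = J + eps * dt * np.outer(xi - x, x) / N
        np.fill_diagonal(J, 0.0)
    return J

# Repeated sequential learning: cycle through the mappings several times,
# rather than training each mapping once to convergence.
for epoch in range(5):
    for mu in range(n_maps):
        J = learn_mapping(J, inputs[mu], targets[mu])

# Recall check: overlap between relaxed activity and each embedded target.
overlaps = []
for mu in range(n_maps):
    x = np.tanh(0.1 * rng.standard_normal(N))
    for _ in range(500):
        x = x + dt * (-x + np.tanh(J @ x + inputs[mu]))
    overlaps.append(float(x @ targets[mu]) / N)
print(overlaps)
```

An overlap near 1 for a mapping would indicate successful recall under its input; how many mappings can be stored this way (the memory capacity) is the quantity the paper reports as drastically increased by the repeated sequential schedule.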

Original language: English
Article number: 023307
Journal: Physical Review Research
Volume: 2
Issue number: 2
DOIs
State: Published - Jun 2020

All Science Journal Classification (ASJC) codes

  • General Physics and Astronomy
