Abstract
Memories in neural systems are shaped through the interplay of neural and learning dynamics under external inputs. This interplay can result in either overwriting or strengthening of memories as the system is repeatedly exposed to multiple input-output mappings, but it is unclear which effect dominates. By introducing a simple local learning rule to a neural network, we found that the memory capacity is drastically increased by sequentially repeating the learning steps of input-output mappings. We show that the resulting connectivity decorrelates the target patterns. This process is associated with the emergence of spontaneous activity that intermittently exhibits neural patterns corresponding to embedded memories. Stabilization of memories is achieved by a distinct bifurcation from the spontaneous activity under the application of each input.
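The abstract describes embedding multiple input-output mappings in a recurrent network by sequentially repeating learning steps of a simple local rule until each target pattern is stabilized. The paper's specific rule and dynamics are not reproduced here; as a minimal illustrative sketch, the snippet below uses a perceptron-type local update for a Hopfield-style recurrent network (a standard stand-in, not the authors' rule), in which each neuron strengthens its incoming weights only while its target state is not yet stably retrieved, and the patterns are presented sequentially over repeated epochs.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 10                       # neurons, number of target patterns
patterns = rng.choice([-1.0, 1.0], size=(P, N))

J = np.zeros((N, N))                 # recurrent couplings, no self-coupling
eta = 0.1                            # learning rate (illustrative value)

# Sequentially repeat the learning steps over all patterns:
# each neuron i applies a local, Hebbian-style update only while
# its target bit is not yet a stable fixed point with margin 1.
for epoch in range(50):
    for xi in patterns:
        h = J @ xi                            # local field at each neuron
        unstable = (xi * h) <= 1.0            # per-neuron margin condition
        J += eta * np.outer(unstable * xi, xi) / N
        np.fill_diagonal(J, 0.0)

# Recall test: every stored pattern should now be a fixed point
# of the sign dynamics x -> sign(J @ x).
for xi in patterns:
    assert np.array_equal(np.sign(J @ xi), xi)
```

With repeated sequential presentation this rule converges well below the perceptron capacity limit; the abstract's point is that such repetition reshapes the connectivity so that the effective stored patterns become decorrelated, which is what lifts the capacity above the naive one-shot Hebbian limit.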
| Field | Value |
|---|---|
| Original language | English |
| Article number | 023307 |
| Journal | PHYSICAL REVIEW RESEARCH |
| Volume | 2 |
| Issue number | 2 |
| DOIs | |
| State | Published - Jun 2020 |
All Science Journal Classification (ASJC) codes
- General Physics and Astronomy
Cite this: 'Repeated sequential learning increases memory capacity via effective decorrelation in a recurrent neural network', PHYSICAL REVIEW RESEARCH 2, 023307 (2020).