Task-agnostic continual learning using online variational Bayes with fixed-point updates

Chen Zeno, Itay Golan, Elad Hoffer, Daniel Soudry

Research output: Contribution to journal › Article › peer-review

Abstract

Catastrophic forgetting is the notorious vulnerability of neural networks to changes in the data distribution during learning. This phenomenon has long been considered a major obstacle to using learning agents in realistic continual learning settings. A large body of continual learning research assumes that task boundaries are known during training. However, only a few works consider scenarios in which task boundaries are unknown or not well defined: task-agnostic scenarios. The optimal Bayesian solution for this requires an intractable online Bayes update to the weights posterior. We aim to approximate the online Bayes update as accurately as possible. To do so, we derive novel fixed-point equations for the online variational Bayes optimization problem for multivariate Gaussian parametric distributions. By iterating the posterior through these fixed-point equations, we obtain an algorithm (FOO-VB) for continual learning that can handle nonstationary data distributions using a fixed architecture and without using external memory (i.e., without access to previous data). We demonstrate that FOO-VB outperforms existing methods in task-agnostic scenarios. A PyTorch implementation of FOO-VB is available at https://github.com/chenzeno/FOO-VB.
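The abstract's central object is the online Bayes update: the posterior after each data batch becomes the prior for the next. For neural network weights this recursion is intractable, which is what FOO-VB's fixed-point equations approximate. As a minimal illustration of the recursion itself (not the paper's FOO-VB updates), the sketch below runs the exactly tractable conjugate case: a scalar Gaussian posterior over an unknown mean, updated sequentially from a data stream with no replay buffer. All function and variable names here are illustrative assumptions, not from the paper.

```python
import numpy as np

def online_gaussian_update(mu, var, batch, noise_var=1.0):
    """One online Bayes step for the conjugate case.

    Prior N(mu, var) over the unknown mean, Gaussian likelihood with
    known noise variance; returns the closed-form posterior N(mu', var').
    """
    n = len(batch)
    precision = 1.0 / var + n / noise_var        # precisions add
    new_var = 1.0 / precision
    new_mu = new_var * (mu / var + np.sum(batch) / noise_var)
    return new_mu, new_var

rng = np.random.default_rng(0)
mu, var = 0.0, 10.0                              # broad initial prior
true_mean = 2.0
for _ in range(50):                              # stream of small batches
    batch = rng.normal(true_mean, 1.0, size=10)
    # posterior-becomes-prior: the defining structure of online Bayes
    mu, var = online_gaussian_update(mu, var, batch)

# The posterior concentrates near the true mean as data streams in.
print(mu, var)
```

In the conjugate case no data need ever be revisited, because the posterior is a sufficient summary of everything seen so far; FOO-VB pursues the same property for multivariate Gaussian approximations to neural network weight posteriors, where the update has no closed form and is instead iterated to a fixed point.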

Original language: English
Pages (from-to): 3139-3177
Number of pages: 39
Journal: Neural Computation
Volume: 33
Issue number: 11
DOIs
State: Published - 12 Oct 2021

All Science Journal Classification (ASJC) codes

  • Arts and Humanities (miscellaneous)
  • Cognitive Neuroscience
