FearNet: Brain-Inspired Model for Incremental Learning

(2017). cite arxiv:1711.10563. Comment: Under review as a conference paper at ICLR 2018.

Abstract

Incremental class learning involves sequentially learning classes in bursts of examples from the same class. This violates the assumptions that underlie methods for training standard deep neural networks, and will cause them to suffer from catastrophic forgetting. Arguably, the best method for incremental class learning is iCaRL, but it requires storing training examples for each class, making it challenging to scale. Here, we propose FearNet for incremental class learning. FearNet is a generative model that does not store previous examples, making it memory efficient. FearNet uses a brain-inspired dual-memory system in which new memories are consolidated from a network for recent memories inspired by the mammalian hippocampal complex to a network for long-term storage inspired by medial prefrontal cortex. Memory consolidation is inspired by mechanisms that occur during sleep. FearNet also uses a module inspired by the basolateral amygdala for determining which memory system to use for recall. FearNet achieves state-of-the-art performance at incremental class learning on image (CIFAR-100, CUB-200) and audio classification (AudioSet) benchmarks.
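The abstract describes a dual-memory design: recent classes are held in a hippocampus-inspired store of raw examples, a "sleep" phase consolidates them into a generative mPFC-inspired long-term store by replaying pseudo-examples instead of stored data, and a BLA-inspired gate decides which store answers a recall query. A minimal toy sketch of this idea (not the paper's actual networks; the class name, per-class Gaussian generative model, and nearest-mean recall are illustrative assumptions):

```python
import random
import statistics

class FearNetSketch:
    """Toy sketch of FearNet's dual-memory idea (hypothetical simplification):
    - 'recent' store: raw feature vectors for recently learned classes (HC-like)
    - 'longterm' store: per-class feature means/stds used as a crude generative
      model for pseudo-example replay, so no raw data is retained (mPFC-like)
    - recall() acts as a BLA-like gate choosing which store answers a query
    """
    def __init__(self):
        self.recent = {}    # label -> list of raw feature vectors
        self.longterm = {}  # label -> (means, stds) per feature dimension

    def study(self, label, examples):
        # New classes first land in the recent-memory store.
        self.recent.setdefault(label, []).extend(examples)

    def sleep(self):
        # Consolidation: replay generated pseudo-examples of old classes
        # alongside the recent raw examples, refit the long-term store on
        # the mix, then clear the recent store.
        replay = {lbl: self._generate(lbl, 5) for lbl in self.longterm}
        replay.update(self.recent)
        for lbl, examples in replay.items():
            dims = list(zip(*examples))
            means = [statistics.fmean(d) for d in dims]
            stds = [statistics.pstdev(d) or 1e-6 for d in dims]
            self.longterm[lbl] = (means, stds)
        self.recent.clear()

    def _generate(self, label, n):
        # Sample pseudo-examples from the stored per-class Gaussians.
        means, stds = self.longterm[label]
        return [[random.gauss(m, s) for m, s in zip(means, stds)]
                for _ in range(n)]

    def recall(self, x):
        # Gate between stores: return the label of the nearest match,
        # whether it lives in recent memory or long-term memory.
        best = (float("inf"), None)
        for lbl, examples in self.recent.items():
            for e in examples:
                d = sum((a - b) ** 2 for a, b in zip(x, e))
                best = min(best, (d, lbl))
        for lbl, (means, _) in self.longterm.items():
            d = sum((a - b) ** 2 for a, b in zip(x, means))
            best = min(best, (d, lbl))
        return best[1]
```

For example, a class studied before a sleep phase remains recallable from the long-term store even after its raw examples are discarded, while a newly studied class is answered from the recent store — which is the memory-efficiency point the abstract makes against iCaRL's stored exemplars.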
