FearNet: Brain-Inspired Model for Incremental Learning
R. Kemker and C. Kanan (2017). arXiv:1711.10563. Comment: Under review as a conference paper at ICLR 2018.
Abstract
Incremental class learning involves sequentially learning classes in bursts
of examples from the same class. This violates the assumptions that underlie
methods for training standard deep neural networks, and will cause them to
suffer from catastrophic forgetting. Arguably, the best method for incremental
class learning is iCaRL, but it requires storing training examples for each
class, making it challenging to scale. Here, we propose FearNet for incremental
class learning. FearNet is a generative model that does not store previous
examples, making it memory efficient. FearNet uses a brain-inspired dual-memory
system in which new memories are consolidated from a network for recent
memories inspired by the mammalian hippocampal complex to a network for
long-term storage inspired by medial prefrontal cortex. Memory consolidation is
inspired by mechanisms that occur during sleep. FearNet also uses a module
inspired by the basolateral amygdala for determining which memory system to use
for recall. FearNet achieves state-of-the-art performance at incremental class
learning on image (CIFAR-100, CUB-200) and audio classification (AudioSet)
benchmarks.
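The abstract describes a dual-memory architecture: a hippocampus-inspired network for recently learned classes, a medial-prefrontal-cortex-inspired network for consolidated long-term storage, and a basolateral-amygdala-inspired module that decides which memory system answers a given query. A minimal sketch of that recall routing, with hypothetical names that stand in for the three modules (this is not the authors' code):

```python
# Illustrative sketch of FearNet-style recall routing. The three callables
# are stand-ins: recent_net (HC-like store for new classes), longterm_net
# (mPFC-like consolidated store), and gate (BLA-like module estimating the
# probability that an input belongs to recently learned classes).

def make_fearnet_recall(recent_net, longterm_net, gate):
    """Return a recall function that routes a query to one memory system."""
    def recall(x):
        if gate(x) >= 0.5:
            return recent_net(x)    # consult the recent-memory network
        return longterm_net(x)      # consult the long-term network
    return recall

# Toy usage with hand-made stand-ins for the three modules:
recent_net = lambda x: "new_class"
longterm_net = lambda x: "old_class"
gate = lambda x: 1.0 if x > 5 else 0.0  # pretend large inputs look "recent"

recall = make_fearnet_recall(recent_net, longterm_net, gate)
print(recall(10))  # -> new_class
print(recall(1))   # -> old_class
```

The point of the gate is that neither memory network needs to know about the other; consolidation (the sleep-inspired transfer of recent memories into long-term storage) would only change which store holds a class, not the recall interface.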
@misc{kemker2017fearnet,
abstract = {Incremental class learning involves sequentially learning classes in bursts
of examples from the same class. This violates the assumptions that underlie
methods for training standard deep neural networks, and will cause them to
suffer from catastrophic forgetting. Arguably, the best method for incremental
class learning is iCaRL, but it requires storing training examples for each
class, making it challenging to scale. Here, we propose FearNet for incremental
class learning. FearNet is a generative model that does not store previous
examples, making it memory efficient. FearNet uses a brain-inspired dual-memory
system in which new memories are consolidated from a network for recent
memories inspired by the mammalian hippocampal complex to a network for
long-term storage inspired by medial prefrontal cortex. Memory consolidation is
inspired by mechanisms that occur during sleep. FearNet also uses a module
inspired by the basolateral amygdala for determining which memory system to use
for recall. FearNet achieves state-of-the-art performance at incremental class
learning on image (CIFAR-100, CUB-200) and audio classification (AudioSet)
benchmarks.},
added-at = {2018-01-12T13:48:34.000+0100},
author = {Kemker, Ronald and Kanan, Christopher},
biburl = {https://www.bibsonomy.org/bibtex/2337787e0d1b2702b2a032f377534bdac/pavelkrolevets},
description = {FearNet: Brain-Inspired Model for Incremental Learning},
interhash = {70d98b0797c9b5f7a9a11353b4bad386},
intrahash = {337787e0d1b2702b2a032f377534bdac},
keywords = {incremental learning networks neural},
  note = {cite arxiv:1711.10563; Comment: Under review as a conference paper at ICLR 2018},
timestamp = {2018-01-12T13:48:34.000+0100},
title = {FearNet: Brain-Inspired Model for Incremental Learning},
url = {http://arxiv.org/abs/1711.10563},
year = 2017
}