Incremental and Decremental Support Vector Machine Learning
G. Cauwenberghs and T. Poggio. Advances in Neural Information Processing Systems (NIPS*2000), vol. 13, 2001.
Abstract
An on-line recursive algorithm for training support vector machines,
one vector at a time, is presented. Adiabatic increments retain
the Kuhn-Tucker conditions on all previously seen training data,
in a number of steps each computed analytically. The incremental
procedure is reversible, and decremental ``unlearning'' offers an
efficient method to exactly evaluate leave-one-out generalization
performance. Interpretation of decremental unlearning in feature
space sheds light on the relationship between generalization and
geometry of the data.
@inproceedings{cauwenbergs01incrementaldecremental,
abstract = {An on-line recursive algorithm for training support vector machines,
one vector at a time, is presented. Adiabatic increments retain
the Kuhn-Tucker conditions on all previously seen training data,
in a number of steps each computed analytically. The incremental
procedure is reversible, and decremental ``unlearning'' offers an
efficient method to exactly evaluate leave-one-out generalization
performance. Interpretation of decremental unlearning in feature
space sheds light on the relationship between generalization and
geometry of the data.},
author = {Cauwenberghs, G. and Poggio, T.},
booktitle = {Advances in Neural Information Processing Systems (NIPS*2000)},
title = {Incremental and Decremental Support Vector Machine Learning},
url = {http://bach.ece.jhu.edu/pub/gert/papers/nips00_inc.pdf},
volume = 13,
year = 2001
}