Inproceedings

Incremental and Decremental Support Vector Machine Learning

, and .
Advances in Neural Information Processing Systems 13 (NIPS*2000), 2001.

Abstract

An on-line recursive algorithm for training support vector machines, one vector at a time, is presented. Adiabatic increments retain the Kuhn-Tucker conditions on all previously seen training data, in a number of steps each computed analytically. The incremental procedure is reversible, and decremental "unlearning" offers an efficient method to exactly evaluate leave-one-out generalization performance. Interpretation of decremental unlearning in feature space sheds light on the relationship between generalization and geometry of the data.
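The quantities the abstract refers to can be illustrated with a toy sketch. The code below trains a no-bias linear SVM on its dual by projected gradient ascent, computes the margin function g_i = y_i f(x_i) − 1 that the Kuhn-Tucker conditions constrain, and counts exact leave-one-out errors. Note the hedges: the paper obtains the leave-one-out count efficiently through reversible adiabatic decrements, whereas this sketch naively retrains per held-out point; the solver, the dataset, and all names (`train_dual`, `loo_errors`) are illustrative assumptions, not the paper's implementation.

```python
def dot(a, b):
    # Plain dot product so the sketch needs no external libraries.
    return sum(u * v for u, v in zip(a, b))

def train_dual(X, y, C=10.0, eta=0.01, iters=3000):
    """Maximize the no-bias soft-margin SVM dual
    W(a) = sum_i a_i - 1/2 sum_ij a_i a_j y_i y_j (x_i . x_j)
    subject to 0 <= a_i <= C, by projected gradient ascent.
    (Stand-in solver; the paper updates coefficients adiabatically.)"""
    n = len(X)
    Q = [[y[i] * y[j] * dot(X[i], X[j]) for j in range(n)] for i in range(n)]
    alpha = [0.0] * n
    for _ in range(iters):
        grad = [1.0 - sum(Q[i][j] * alpha[j] for j in range(n)) for i in range(n)]
        # Gradient step, then projection onto the box [0, C].
        alpha = [min(C, max(0.0, alpha[i] + eta * grad[i])) for i in range(n)]
    return alpha

def decision(X, y, alpha, x):
    # f(x) = sum_j a_j y_j (x_j . x); no bias term in this sketch.
    return sum(alpha[j] * y[j] * dot(X[j], x) for j in range(len(X)))

def loo_errors(X, y):
    """Exact leave-one-out error count. Here each point is removed and
    the machine naively retrained; decremental unlearning reaches the
    same leave-one-out prediction without retraining from scratch."""
    errs = 0
    for i in range(len(X)):
        Xi, yi = X[:i] + X[i + 1:], y[:i] + y[i + 1:]
        a = train_dual(Xi, yi)
        if y[i] * decision(Xi, yi, a, X[i]) <= 0:
            errs += 1
    return errs

# Toy linearly separable data (illustrative, not from the paper).
X = [(2, 2), (3, 3), (2, 3), (-2, -2), (-3, -3), (-2, -3)]
y = [1, 1, 1, -1, -1, -1]
alpha = train_dual(X, y)

# KKT partition: margin vectors have g ~ 0 with 0 < a < C,
# well-classified (reserve) vectors have g > 0 with a = 0.
g = [y[i] * decision(X, y, alpha, X[i]) - 1 for i in range(len(X))]
print("g values:", [round(v, 2) for v in g])
print("leave-one-out errors:", loo_errors(X, y))
```

On this separable toy set every point is classified correctly both in training and under leave-one-out, so the printed error count is 0; the g values expose which points sit exactly on the margin.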
