Inproceedings

Generalized Support Vector Machines

In Advances in Large Margin Classifiers, pages 135-146. MIT Press, Cambridge, MA, 2000.

Abstract

Arbitrary kernel functions that need not satisfy Mercer's condition can be used. This goal is achieved by separating the regularizer from the actual separation condition. For quadratic regularization this leads to a convex quadratic program that is no more difficult to solve than the standard SV optimization problem. Sparse expansions are obtained when the $1$-norm of the expansion coefficients is chosen to restrict the class of admissible functions. The problems are formulated in a way that is compatible with the mathematical programming literature.
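A minimal sketch of the formulation the abstract describes, written in the generalized-SVM notation commonly used for this problem (the symbol names are assumptions, not taken from the entry): $A$ is the $m \times n$ matrix of training points, $D = \mathrm{diag}(d)$ holds the $\pm 1$ labels, $K(A,A^{\top})$ is an arbitrary $m \times m$ kernel matrix, $u$ the expansion coefficients, $\gamma$ the offset, $y$ the slack variables, $e$ the vector of ones, and $\nu > 0$ the error penalty:

\[
\min_{u,\,\gamma,\,y}\; \nu\, e^{\top} y + f(u)
\quad\text{s.t.}\quad
D\bigl(K(A,A^{\top})\, D u - e\gamma\bigr) + y \ge e,\qquad y \ge 0.
\]

Because the regularizer $f$ acts directly on $u$ rather than through the kernel, choosing $f(u) = \tfrac{1}{2} u^{\top} u$ gives a convex quadratic program even when $K$ is indefinite (i.e., does not satisfy Mercer's condition), while choosing $f(u) = \lVert u \rVert_{1}$ turns the problem into a linear program whose solutions tend to be sparse in the expansion coefficients.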
