Techreport

On the optimal parameter choice for $\nu$-support vector machines

University of Jena (2002)

Abstract

We determine the asymptotically optimal choice of the parameter $\nu$ for classifiers of $\nu$-support vector machine ($\nu$-SVM) type, which were introduced by Schölkopf et al. It turns out that $\nu$ should be a close upper estimate of twice the optimal Bayes risk, provided that the classifier uses a so-called universal kernel such as the Gaussian RBF kernel. Moreover, several experiments show that this result can be used to implement modified cross-validation procedures which both train significantly faster and learn significantly better than standard cross-validation techniques.
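The abstract's practical recipe — set $\nu$ to a close upper estimate of twice the Bayes risk rather than searching a full grid — can be sketched as follows. This is a minimal illustration, not the report's own implementation: it assumes scikit-learn's `NuSVC`, a toy dataset with an injected 5% label-noise rate standing in for the (normally unknown) Bayes risk $R^*$, and a hypothetical 20% safety margin for the "upper estimate".

```python
# Sketch: pick nu slightly above 2 * (estimated Bayes risk), per the report,
# instead of cross-validating nu over a full grid. Assumes scikit-learn.
import numpy as np
from sklearn.svm import NuSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 400
X = rng.normal(size=(n, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # clean linear concept

# Inject 5% label noise, so the Bayes risk of this toy problem is about 0.05.
flip = rng.random(n) < 0.05
y[flip] = 1 - y[flip]

bayes_risk_estimate = 0.05                 # assumed known/estimated here
nu = min(0.99, 2 * bayes_risk_estimate * 1.2)  # close upper estimate of 2*R

# Gaussian RBF is a universal kernel, as required by the report's result.
clf = NuSVC(nu=nu, kernel="rbf", gamma="scale")
score = cross_val_score(clf, X, y, cv=5).mean()
```

In a modified cross-validation procedure of the kind the abstract mentions, one would cross-validate only the remaining kernel parameters (here `gamma`) while fixing $\nu$ by the rule above, shrinking the search space accordingly.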
