Abstract
We determine the asymptotically optimal choice of the parameter $\nu$
for classifiers of $\nu$-support vector machine ($\nu$-SVM) type,
which were introduced by Schölkopf et al. It turns out that
$\nu$ should be a close upper estimate of twice the optimal Bayes
risk, provided that the classifier uses a so-called universal kernel
such as the Gaussian RBF kernel. Moreover, several experiments show
that this result can be used to implement modified cross-validation
procedures which both train significantly faster and learn significantly
better than standard cross-validation techniques.
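As an illustration of the stated rule (not the paper's own experiments), the following sketch builds a toy problem whose Bayes risk is known by construction (10% label noise on an otherwise separable rule, so the Bayes risk is 0.1) and fits scikit-learn's `NuSVC` with a Gaussian RBF kernel, choosing $\nu$ as a close upper estimate of twice that risk. The data-generating rule and all parameter values here are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import NuSVC

rng = np.random.default_rng(0)
n = 2000
X = rng.uniform(-1, 1, size=(n, 2))
y_clean = (X[:, 0] > 0).astype(int)        # noiseless rule: sign of the first coordinate
flip = rng.random(n) < 0.1                 # flip 10% of labels at random
y = np.where(flip, 1 - y_clean, y_clean)   # => Bayes risk is 0.1 by construction

# Per the result above: choose nu as a close upper estimate of
# twice the Bayes risk, i.e. slightly above 2 * 0.1 = 0.2.
clf = NuSVC(nu=0.25, kernel="rbf", gamma=1.0)
clf.fit(X[:1000], y[:1000])
acc = clf.score(X[1000:], y[1000:])        # test accuracy; at most ~0.9 given the noise
```

In a modified cross-validation procedure as described, one would estimate the Bayes risk from data and search $\nu$ only in a narrow band above twice that estimate, rather than over the full $(0, 1)$ range.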