Inproceedings

Infinite Markov-Switching Maximum Entropy Discrimination Machines

Sotirios P. Chatzis.
Proceedings of the 30th International Conference on Machine Learning, volume 28 of Proceedings of Machine Learning Research, pages 729--737, Atlanta, Georgia, USA, PMLR, 17--19 Jun 2013.

Abstract

In this paper, we present a method that combines the merits of Bayesian nonparametrics, specifically stick-breaking priors, and large-margin kernel machines in the context of sequential data classification. The proposed model postulates a (theoretically infinite) set of interdependent large-margin classifiers as model components that robustly capture the local nonlinearity of complex data. The postulated large-margin classifiers are connected through a Markov-switching construction that allows for capturing complex temporal dynamics in the modeled datasets. Appropriate stick-breaking priors are imposed over the component switching mechanism of our model to allow for data-driven determination of the optimal number of component large-margin classifiers, under a standard nonparametric Bayesian inference scheme. Efficient model training is performed under the maximum entropy discrimination (MED) framework, which integrates the large-margin principle with Bayesian posterior inference. We evaluate our method on several real-world datasets and compare it to state-of-the-art alternatives.
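For readers less familiar with the machinery the abstract invokes, the following is a minimal sketch of the standard stick-breaking construction underlying such switching priors. It is background notation, not the paper's exact formulation; the symbols (v_k, pi_k, alpha) are the conventional ones rather than names taken from the paper, and in a Markov-switching model one such prior would typically be placed over each row of the (infinite) state-transition matrix.

    % Stick-breaking construction of mixing weights over a (theoretically)
    % infinite set of components; \alpha > 0 is a concentration hyperparameter
    % controlling how quickly the weights decay.
    \begin{align*}
      v_k   &\sim \mathrm{Beta}(1, \alpha), \qquad k = 1, 2, \dots \\
      \pi_k &= v_k \prod_{j=1}^{k-1} (1 - v_j)
    \end{align*}

The weights \pi_k sum to one almost surely and decay rapidly with k, so only a finite number of components carry appreciable posterior mass in practice; this is what enables the data-driven determination of the number of component large-margin classifiers that the abstract describes.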
