Article

Boltzmann machines and energy-based models

(2017). arXiv:1708.06008. Comment: 36 pages. The topics covered in this paper are presented in Part I of the IJCAI-17 tutorial on energy-based machine learning. https://researcher.watson.ibm.com/researcher/view_group.php?id=7834.

Abstract

We review Boltzmann machines and energy-based models. A Boltzmann machine defines a probability distribution over binary-valued patterns. One can learn the parameters of a Boltzmann machine via gradient-based approaches that increase the log-likelihood of the data. The gradient and Hessian of a Boltzmann machine admit beautiful mathematical representations, although computing them is in general intractable. This intractability motivates approximate methods, including the Gibbs sampler and contrastive divergence, as well as tractable alternatives, namely energy-based models.
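To illustrate the learning procedure the abstract describes, here is a minimal sketch of contrastive divergence (CD-1) for a tiny restricted Boltzmann machine. The network sizes, data, and hyperparameters are illustrative assumptions, not taken from the paper; the update rule approximates the intractable log-likelihood gradient with a single Gibbs step.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny restricted Boltzmann machine: 6 visible, 3 hidden binary units.
# All sizes and the training pattern below are illustrative choices.
n_v, n_h = 6, 3
W = 0.01 * rng.standard_normal((n_v, n_h))  # weight matrix
b = np.zeros(n_v)                           # visible biases
c = np.zeros(n_h)                           # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def energy(v, h):
    # E(v, h) = -b^T v - c^T h - v^T W h
    return -(b @ v) - (c @ h) - (v @ W @ h)

def cd1_step(v0, lr=0.1):
    """One contrastive-divergence (CD-1) update on a single pattern."""
    global W, b, c
    # Positive phase: hidden activations given the data.
    ph0 = sigmoid(c + v0 @ W)
    h0 = (rng.random(n_h) < ph0).astype(float)
    # Negative phase: one Gibbs step back to the visibles, then hidden probs.
    pv1 = sigmoid(b + W @ h0)
    v1 = (rng.random(n_v) < pv1).astype(float)
    ph1 = sigmoid(c + v1 @ W)
    # Approximate log-likelihood gradient: data statistics minus model statistics.
    W += lr * (np.outer(v0, ph0) - np.outer(v1, ph1))
    b += lr * (v0 - v1)
    c += lr * (ph0 - ph1)

# Repeatedly train on a single binary pattern.
pattern = np.array([1.0, 1.0, 0.0, 0.0, 1.0, 0.0])
for _ in range(500):
    cd1_step(pattern)
```

With more Gibbs steps per update (CD-k) the negative-phase samples approach the model distribution, which is the role the Gibbs sampler plays in exact maximum-likelihood learning.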
