@article{Buntine:1994,
abstract = {This paper is a multidisciplinary review of empirical, statistical learning
from a graphical model perspective. Well-known examples of graphical models
include Bayesian networks, directed graphs representing a Markov chain, and
undirected networks representing a Markov field. These graphical models are
extended to model data analysis and empirical learning using the notation of
plates. Graphical operations for simplifying and manipulating a problem are
provided, including decomposition, differentiation, and the manipulation of
probability models from the exponential family. Two standard algorithm schemas
for learning are reviewed in a graphical framework: Gibbs sampling and the
expectation-maximization algorithm. Using these operations and schemas, some
popular algorithms can be synthesized from their graphical specification. This
includes versions of linear regression, techniques for feed-forward networks,
and learning Gaussian and discrete Bayesian networks from data. The paper
concludes by sketching some implications for data analysis and summarizing how
some popular algorithms fall within the framework presented. The main original
contributions here are the decomposition techniques and the demonstration that
graphical models provide a framework for understanding and developing complex
learning algorithms.},
added-at = {2010-05-27T23:32:30.000+0200},
author = {Buntine, W. L.},
biburl = {https://www.bibsonomy.org/bibtex/23332585ffd66af0b3834398e1b5b0d72/wnpxrz},
description = {[cs/9412102] Operations for Learning with Graphical Models},
interhash = {c7dd650780467c934551356630a7b739},
intrahash = {3332585ffd66af0b3834398e1b5b0d72},
journal = {Journal of Artificial Intelligence Research},
keywords = {bayesian buntine network plate proj:o4p},
pages = {159--225},
timestamp = {2010-05-27T23:32:30.000+0200},
title = {{Operations for Learning with Graphical Models}},
url = {http://arxiv.org/abs/cs/9412102},
volume = {2},
year = {1994}
}