V. Mullachery, A. Khera, and A. Husain. (2018). cite arxiv:1801.07710. Comment: arXiv admin note: text overlap with arXiv:1111.4246 by other authors.
Abstract
This paper describes and discusses Bayesian Neural Networks (BNNs). The paper
showcases a few different applications of them for classification and
regression problems. BNNs are composed of a Probabilistic Model and a Neural
Network. The intent of such a design is to combine the strengths of Neural
Networks and Stochastic modeling. Neural Networks exhibit continuous function
approximator capabilities. Stochastic models allow direct specification of a
model with known interaction between parameters to generate data. During the
prediction phase, stochastic models generate a complete posterior distribution
and produce probabilistic guarantees on the predictions. Thus BNNs are a unique
combination of neural networks and stochastic models, with the stochastic model
forming the core of this integration. BNNs can then produce probabilistic
guarantees on their predictions and also generate the distribution of parameters
that they have learnt from the observations. This means that, in the parameter
space, one can deduce the nature and shape of the neural network's learnt
parameters. These two characteristics make them highly attractive to
theoreticians as well as practitioners. Recently there has been a lot of
activity in this area, with the advent of numerous probabilistic programming
libraries such as PyMC3, Edward, and Stan. Further, this area is rapidly gaining
ground as a standard machine learning approach for numerous problems.
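The abstract's central idea — inferring a full posterior distribution over a network's weights rather than a point estimate — can be illustrated with a minimal sketch. This example is not from the paper: it is a hypothetical one-weight "network" y = tanh(w·x) with a Gaussian prior on w, sampled with random-walk Metropolis using only the Python standard library (a real BNN would use a library such as PyMC3 and many weights).

```python
import math
import random

random.seed(0)

# Tiny synthetic dataset generated from y = tanh(2.0 * x) plus noise.
xs = [-2.0, -1.0, -0.5, 0.0, 0.5, 1.0, 2.0]
true_w = 2.0
ys = [math.tanh(true_w * x) + random.gauss(0.0, 0.1) for x in xs]

def log_posterior(w, sigma=0.1, prior_sd=1.0):
    """Log of the unnormalised posterior over the single weight w:
    Gaussian prior on w times a Gaussian likelihood of the data
    under the one-weight network y = tanh(w * x)."""
    log_prior = -0.5 * (w / prior_sd) ** 2
    log_lik = sum(-0.5 * ((y - math.tanh(w * x)) / sigma) ** 2
                  for x, y in zip(xs, ys))
    return log_prior + log_lik

# Random-walk Metropolis: draw samples from the posterior over w.
samples = []
w = 0.0
for step in range(5000):
    proposal = w + random.gauss(0.0, 0.2)
    # Accept with probability min(1, posterior ratio).
    if math.log(random.random()) < log_posterior(proposal) - log_posterior(w):
        w = proposal
    if step >= 1000:  # discard burn-in
        samples.append(w)

mean_w = sum(samples) / len(samples)
var_w = sum((s - mean_w) ** 2 for s in samples) / len(samples)
print(f"posterior mean of w ~ {mean_w:.2f}, sd ~ {math.sqrt(var_w):.2f}")
```

The posterior samples are exactly what the abstract describes: a learnt distribution over the parameter (concentrated near the generating weight), from which predictive uncertainty — the "probabilistic guarantees" on predictions — can be read off directly.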
%0 Generic
%1 mullachery2018bayesian
%A Mullachery, Vikram
%A Khera, Aniruddh
%A Husain, Amir
%D 2018
%K BNN overview to_read
%T Bayesian Neural Networks
%U http://arxiv.org/abs/1801.07710
%X This paper describes and discusses Bayesian Neural Networks (BNNs). The paper
showcases a few different applications of them for classification and
regression problems. BNNs are composed of a Probabilistic Model and a Neural
Network. The intent of such a design is to combine the strengths of Neural
Networks and Stochastic modeling. Neural Networks exhibit continuous function
approximator capabilities. Stochastic models allow direct specification of a
model with known interaction between parameters to generate data. During the
prediction phase, stochastic models generate a complete posterior distribution
and produce probabilistic guarantees on the predictions. Thus BNNs are a unique
combination of neural networks and stochastic models, with the stochastic model
forming the core of this integration. BNNs can then produce probabilistic
guarantees on their predictions and also generate the distribution of parameters
that they have learnt from the observations. This means that, in the parameter
space, one can deduce the nature and shape of the neural network's learnt
parameters. These two characteristics make them highly attractive to
theoreticians as well as practitioners. Recently there has been a lot of
activity in this area, with the advent of numerous probabilistic programming
libraries such as PyMC3, Edward, and Stan. Further, this area is rapidly gaining
ground as a standard machine learning approach for numerous problems.
@misc{mullachery2018bayesian,
abstract = {This paper describes and discusses Bayesian Neural Networks (BNNs). The paper
showcases a few different applications of them for classification and
regression problems. BNNs are composed of a Probabilistic Model and a Neural
Network. The intent of such a design is to combine the strengths of Neural
Networks and Stochastic modeling. Neural Networks exhibit continuous function
approximator capabilities. Stochastic models allow direct specification of a
model with known interaction between parameters to generate data. During the
prediction phase, stochastic models generate a complete posterior distribution
and produce probabilistic guarantees on the predictions. Thus BNNs are a unique
combination of neural networks and stochastic models, with the stochastic model
forming the core of this integration. BNNs can then produce probabilistic
guarantees on their predictions and also generate the distribution of parameters
that they have learnt from the observations. This means that, in the parameter
space, one can deduce the nature and shape of the neural network's learnt
parameters. These two characteristics make them highly attractive to
theoreticians as well as practitioners. Recently there has been a lot of
activity in this area, with the advent of numerous probabilistic programming
libraries such as PyMC3, Edward, and Stan. Further, this area is rapidly gaining
ground as a standard machine learning approach for numerous problems.},
added-at = {2018-02-10T13:44:10.000+0100},
author = {Mullachery, Vikram and Khera, Aniruddh and Husain, Amir},
biburl = {https://www.bibsonomy.org/bibtex/259aa3c9c53406c3d230260d4a3589e11/jk_itwm},
description = {1801.07710.pdf},
interhash = {88ac568b2dc03a52f99ec5b28b880919},
intrahash = {59aa3c9c53406c3d230260d4a3589e11},
keywords = {BNN overview to_read},
note = {cite arxiv:1801.07710. Comment: arXiv admin note: text overlap with arXiv:1111.4246 by other authors},
timestamp = {2018-02-10T13:44:10.000+0100},
title = {Bayesian Neural Networks},
url = {http://arxiv.org/abs/1801.07710},
year = 2018
}