An approximate KLD based experimental design for models with intractable likelihoods
Z. Ao and J. Li (2020). arXiv:2004.00715. Comment: To appear in AISTATS 2020.
Abstract
Data collection is a critical step in statistical inference and data science,
and the goal of statistical experimental design (ED) is to find the data
collection setup that provides the most information for the inference. In this
work we consider a special type of ED problem in which the likelihood is not
available in closed form. In this case, the popular information-theoretic
Kullback-Leibler divergence (KLD) based design criterion cannot be used
directly, as it requires evaluating the likelihood function. To address this
issue, we derive a new utility function that is a lower bound on the original
KLD utility. This lower bound is expressed as a sum of two or more entropies
in the data space, and can therefore be evaluated efficiently via entropy
estimation methods. We provide several numerical examples to demonstrate the
performance of the proposed method.
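
The KLD utility in question is the classical one: the expected Kullback-Leibler
divergence from the prior to the posterior, which equals the mutual information
between the parameter theta and the data Y and can be written as a difference of
entropies, U(d) = H(Y|d) - E_theta[H(Y|theta, d)]. The Python sketch below (not
from the paper) illustrates how such entropy terms can be estimated from
simulator draws alone, using a k-nearest-neighbour (Kozachenko-Leonenko)
entropy estimator; the toy simulator, the prior, and all sample sizes are
illustrative assumptions, and the sketch uses the basic entropy-difference
identity rather than the paper's specific lower bound.

# Hedged sketch (assumptions: toy simulator, uniform prior, k-NN entropy
# estimation). This is NOT the paper's lower bound; it estimates the standard
# entropy-difference form U(d) = H(Y|d) - E_theta[H(Y|theta,d)] purely from
# simulator output, without evaluating any likelihood.
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def knn_entropy(x, k=3):
    """Kozachenko-Leonenko differential-entropy estimate (in nats)."""
    n, dim = x.shape
    # Distance from each point to its k-th nearest neighbour (self excluded).
    dists, _ = cKDTree(x).query(x, k=k + 1)
    rho = dists[:, -1]
    log_vd = (dim / 2) * np.log(np.pi) - gammaln(dim / 2 + 1)  # unit-ball volume
    return digamma(n) - digamma(k) + log_vd + dim * np.mean(np.log(rho + 1e-300))

def simulate(theta, d, rng, m):
    """Hypothetical implicit model: y = theta*sin(d) + theta-scaled noise."""
    return theta * np.sin(d) + theta * rng.standard_normal((m, 1))

def utility(d, n_outer=2000, n_inner=200, n_cond=50, k=3, seed=0):
    """Monte Carlo estimate of U(d) = H(Y|d) - E_theta[H(Y|theta,d)]."""
    rng = np.random.default_rng(seed)
    thetas = rng.uniform(0.5, 2.0, size=n_outer)  # draws from an assumed prior
    # Marginal entropy: one y per prior draw, so the pooled sample ~ p(y|d).
    y_marg = np.concatenate([simulate(t, d, rng, 1) for t in thetas])
    h_marginal = knn_entropy(y_marg, k)
    # Conditional entropy: average over a subsample of prior draws for speed.
    h_conditional = np.mean([knn_entropy(simulate(t, d, rng, n_inner), k)
                             for t in thetas[:n_cond]])
    return h_marginal - h_conditional

# Rank candidate designs by their estimated information gain.
for d in (0.5, 1.0, 1.5):
    print(f"design d={d}: estimated utility {utility(d):.3f}")

Because every term is the entropy of simulated data, no likelihood evaluations
are required; this is exactly the property that makes entropy-based utilities
attractive for models with intractable likelihoods.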
Description
[2004.00715] An approximate KLD based experimental design for models with intractable likelihoods
%0 Journal Article
%1 ao2020approximate
%A Ao, Ziqiao
%A Li, Jinglai
%D 2020
%K approximate bayesian divergences information uncertainty
%T An approximate KLD based experimental design for models with
intractable likelihoods
%U http://arxiv.org/abs/2004.00715
%X Data collection is a critical step in statistical inference and data science,
and the goal of statistical experimental design (ED) is to find the data
collection setup that provides the most information for the inference. In this
work we consider a special type of ED problem in which the likelihood is not
available in closed form. In this case, the popular information-theoretic
Kullback-Leibler divergence (KLD) based design criterion cannot be used
directly, as it requires evaluating the likelihood function. To address this
issue, we derive a new utility function that is a lower bound on the original
KLD utility. This lower bound is expressed as a sum of two or more entropies
in the data space, and can therefore be evaluated efficiently via entropy
estimation methods. We provide several numerical examples to demonstrate the
performance of the proposed method.
@article{ao2020approximate,
abstract = {Data collection is a critical step in statistical inference and data science,
and the goal of statistical experimental design (ED) is to find the data
collection setup that provides the most information for the inference. In this
work we consider a special type of ED problem in which the likelihood is not
available in closed form. In this case, the popular information-theoretic
Kullback-Leibler divergence (KLD) based design criterion cannot be used
directly, as it requires evaluating the likelihood function. To address this
issue, we derive a new utility function that is a lower bound on the original
KLD utility. This lower bound is expressed as a sum of two or more entropies
in the data space, and can therefore be evaluated efficiently via entropy
estimation methods. We provide several numerical examples to demonstrate the
performance of the proposed method.},
added-at = {2020-04-03T16:41:55.000+0200},
author = {Ao, Ziqiao and Li, Jinglai},
biburl = {https://www.bibsonomy.org/bibtex/2524f44e7d781c78c296360c29fa3865c/kirk86},
description = {[2004.00715] An approximate {KLD} based experimental design for models with intractable likelihoods},
interhash = {4360b2808be5b0d2533c4c561c78792d},
intrahash = {524f44e7d781c78c296360c29fa3865c},
keywords = {approximate bayesian divergences information uncertainty},
note = {To appear in AISTATS 2020; arXiv:2004.00715},
timestamp = {2020-04-03T16:41:55.000+0200},
title = {An approximate {KLD} based experimental design for models with
intractable likelihoods},
url = {http://arxiv.org/abs/2004.00715},
year = 2020
}