A General Framework for Multi-fidelity Bayesian Optimization with Gaussian Processes
J. Song, Y. Chen, and Y. Yue. Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics (AISTATS), volume 89 of Proceedings of Machine Learning Research, pages 3158--3167. PMLR, 16--18 Apr 2019
Abstract
How can we efficiently gather information to optimize an unknown function, when presented with multiple, mutually dependent information sources with different costs? For example, when optimizing a physical system, intelligently trading off computer simulations and real-world tests can lead to significant savings. Existing multi-fidelity Bayesian optimization methods, such as multi-fidelity GP-UCB or Entropy Search-based approaches, either make simplistic assumptions on the interaction among different fidelities or use simple heuristics that lack theoretical guarantees. In this paper, we study multi-fidelity Bayesian optimization with complex structural dependencies among multiple outputs, and propose MF-MI-Greedy, a principled algorithmic framework for addressing this problem. In particular, we model different fidelities using additive Gaussian processes based on shared latent relationships with the target function. Then we use cost-sensitive mutual information gain for efficient Bayesian optimization. We propose a simple notion of regret which incorporates the varying cost of different fidelities, and prove that MF-MI-Greedy achieves low regret. We demonstrate the strong empirical performance of our algorithm on both synthetic and real-world datasets.
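The core idea in the abstract, greedily picking the query whose mutual information gain per unit cost is largest, can be sketched in a few lines. This is a minimal illustration, not the paper's MF-MI-Greedy implementation: it assumes all fidelities share a single latent GP (a simplification of the paper's additive-GP model), and the names `mi_gain` and `select` are placeholders invented here.

```python
import numpy as np

def rbf(a, b, ls=0.3):
    # Squared-exponential kernel between 1-D point sets a (n,) and b (m,)
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior_var(X, Xs, noise=1e-4):
    # Posterior variance of a zero-mean GP at candidates Xs, given observations at X
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    v = np.linalg.solve(K, Ks)
    return np.clip(np.diag(rbf(Xs, Xs) - Ks.T @ v), 1e-12, None)

def mi_gain(X_obs, x, Xs, noise=1e-4):
    # Information gain of querying x about the target on grid Xs:
    # reduction in the sum of Gaussian entropies (0.5 * log variance terms)
    before = 0.5 * np.sum(np.log(gp_posterior_var(X_obs, Xs, noise)))
    after = 0.5 * np.sum(np.log(gp_posterior_var(np.append(X_obs, x), Xs, noise)))
    return before - after

def select(X_obs, candidates, costs, Xs):
    # Greedy cost-sensitive step: choose the (fidelity, point) pair
    # maximizing information gain divided by that fidelity's query cost
    best, best_ratio = None, -np.inf
    for fid, xs in candidates.items():
        for x in xs:
            ratio = mi_gain(X_obs, x, Xs) / costs[fid]
            if ratio > best_ratio:
                best, best_ratio = (fid, x), ratio
    return best
```

With equal candidate points, the cheap fidelity wins the gain-per-cost comparison, which is the intended trade-off (e.g. preferring a simulation over a real-world test when both are similarly informative).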
@inproceedings{pmlr-v89-song19b,
abstract = {How can we efficiently gather information to optimize an unknown function, when presented with multiple, mutually dependent information sources with different costs? For example, when optimizing a physical system, intelligently trading off computer simulations and real-world tests can lead to significant savings. Existing multi-fidelity Bayesian optimization methods, such as multi-fidelity GP-UCB or Entropy Search-based approaches, either make simplistic assumptions on the interaction among different fidelities or use simple heuristics that lack theoretical guarantees. In this paper, we study multi-fidelity Bayesian optimization with complex structural dependencies among multiple outputs, and propose MF-MI-Greedy, a principled algorithmic framework for addressing this problem. In particular, we model different fidelities using additive Gaussian processes based on shared latent relationships with the target function. Then we use cost-sensitive mutual information gain for efficient Bayesian optimization. We propose a simple notion of regret which incorporates the varying cost of different fidelities, and prove that MF-MI-Greedy achieves low regret. We demonstrate the strong empirical performance of our algorithm on both synthetic and real-world datasets.},
author = {Song, Jialin and Chen, Yuxin and Yue, Yisong},
booktitle = {Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics (AISTATS)},
editor = {Chaudhuri, Kamalika and Sugiyama, Masashi},
keywords = {bayesian gaussian-processes},
month = {16--18 Apr},
pages = {3158--3167},
pdf = {http://proceedings.mlr.press/v89/song19b/song19b.pdf},
publisher = {PMLR},
series = {Proceedings of Machine Learning Research},
title = {A General Framework for Multi-fidelity Bayesian Optimization with Gaussian Processes},
url = {http://proceedings.mlr.press/v89/song19b.html},
volume = 89,
year = 2019
}