A General Framework for Multi-fidelity Bayesian Optimization with Gaussian Processes

Proceedings of Machine Learning Research, volume 89, pages 3158--3167. PMLR, 16--18 Apr 2019.

Abstract

How can we efficiently gather information to optimize an unknown function, when presented with multiple, mutually dependent information sources with different costs? For example, when optimizing a physical system, intelligently trading off computer simulations and real-world tests can lead to significant savings. Existing multi-fidelity Bayesian optimization methods, such as multi-fidelity GP-UCB or Entropy Search-based approaches, either make simplistic assumptions on the interaction among different fidelities or use simple heuristics that lack theoretical guarantees. In this paper, we study multi-fidelity Bayesian optimization with complex structural dependencies among multiple outputs, and propose MF-MI-Greedy, a principled algorithmic framework for addressing this problem. In particular, we model different fidelities using additive Gaussian processes based on shared latent relationships with the target function. Then we use cost-sensitive mutual information gain for efficient Bayesian optimization. We propose a simple notion of regret which incorporates the varying cost of different fidelities, and prove that MF-MI-Greedy achieves low regret. We demonstrate the strong empirical performance of our algorithm on both synthetic and real-world datasets.
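The cost-sensitive acquisition idea in the abstract — favoring the query whose expected information gain is largest per unit cost — can be sketched with a toy single-output GP. This is an illustrative assumption-laden sketch, not the paper's MF-MI-Greedy implementation: the RBF kernel, the per-fidelity costs and noise levels, and the candidate grid are all made up here, and the actual algorithm additionally models dependencies across fidelities with additive GPs.

```python
import numpy as np

def rbf(a, b, ls=0.5):
    # Squared-exponential kernel between two 1-D point sets
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def posterior_var(X_obs, X_cand, noise, ls=0.5):
    # GP posterior variance at candidates given noisy observations
    K = rbf(X_obs, X_obs, ls) + noise * np.eye(len(X_obs))
    k = rbf(X_obs, X_cand, ls)
    prior = np.ones(len(X_cand))  # rbf(x, x) = 1
    return prior - np.sum(k * np.linalg.solve(K, k), axis=0)

# Two information sources: a cheap, noisy simulator and an expensive,
# accurate test (costs and noise levels are illustrative assumptions)
costs = {"low": 1.0, "high": 10.0}
noise = {"low": 0.1, "high": 1e-2}

X_cand = np.linspace(0.0, 1.0, 11)   # candidate query locations
X_obs = np.array([0.2, 0.8])         # points already evaluated

best = None
for fid, c in costs.items():
    var = posterior_var(X_obs, X_cand, noise[fid])
    # Mutual information gain of one Gaussian observation:
    # 0.5 * log(1 + sigma_f^2(x) / sigma_noise^2)
    gain = 0.5 * np.log1p(var / noise[fid])
    i = int(np.argmax(gain / c))
    if best is None or gain[i] / c > best[0]:
        best = (gain[i] / c, fid, X_cand[i])

print(f"query x={best[2]:.2f} at fidelity '{best[1]}'")
```

With these toy costs the cheap fidelity wins per unit cost, which matches the abstract's motivation: spend cheap simulator queries first and reserve expensive real-world tests for when they are informative enough to justify their cost.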
