@misc{wang2019lightweight,
abstract = {Integrating ML models into software is of growing interest. Building accurate
models requires the right choice of hyperparameters for the training procedures
(learners) on a given training dataset. AutoML tools provide APIs to
automate this choice, which usually involve many trials of different
hyperparameters on the given training dataset. Since training and evaluating
complex models can be time- and resource-consuming, existing AutoML solutions
require a long time or large resources to produce accurate models for large-scale
training data. That prevents AutoML from being embedded in software that needs to
repeatedly tune hyperparameters and produce models to be consumed by other
components, such as large-scale data systems. We present a fast and lightweight
hyperparameter optimization method, FLO, and use it to build an efficient AutoML
solution. Our method minimizes evaluation cost rather than the number of
iterations needed to find accurate models. Our main idea is to leverage a holistic
consideration of the relations among model complexity, evaluation cost, and
accuracy. FLO has strong anytime performance and significantly outperforms
Bayesian optimization and random search for hyperparameter tuning on a large
open-source AutoML benchmark. Our AutoML solution also outperforms top-ranked
AutoML libraries on a majority of the tasks in this benchmark.},
added-at = {2019-11-13T18:07:52.000+0100},
author = {Wang, Chi and Wu, Qingyun},
biburl = {https://www.bibsonomy.org/bibtex/20c1c22a7ef3dd6b188851f286495ad1b/cpankow},
description = {[1911.04706] FLO: Fast and Lightweight Hyperparameter Optimization for AutoML},
interhash = {e39eac349b0ddd3de7d669224d94a26b},
intrahash = {0c1c22a7ef3dd6b188851f286495ad1b},
keywords = {hyperparameter machinelearning},
note = {cite arxiv:1911.04706},
timestamp = {2019-11-13T18:07:52.000+0100},
title = {FLO: Fast and Lightweight Hyperparameter Optimization for AutoML},
url = {http://arxiv.org/abs/1911.04706},
year = {2019}
}