Exploiting correlation and budget constraints in Bayesian multi-armed bandit optimization
Matthew W. Hoffman, Bobak Shahriari and Nando de Freitas
Abstract
We study the effect of taking correlation and budget constraints into consideration in practical Bayesian optimization tasks, such as active sensing and automated machine learning. We compare a large number of techniques from the bandits, experimental design and global optimization literature, including Thompson sampling, expected improvement (EI), probability of improvement (PI), and Bayesian upper confidence bounds (BayesUCB and GPUCB). We also consider approaches specifically designed to take fixed budget constraints into account, such as UCBE and gap-based methods. In our fixed-budget settings, the latter methods perform worse than methods that take correlation among the arms into account. To remedy this, we introduce a novel adaptive Bayesian gap-based exploration method that simultaneously capitalizes on knowledge of the budget and of the correlation among the arms. The method outperforms the other techniques on a sensor network task and on the problem of automatic machine learning technique selection and tuning.
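For intuition, the following is a minimal sketch (not code from the paper) of one of the compared strategies: expected improvement under a Gaussian process surrogate, which captures correlation among the arms through the kernel. The toy objective, kernel choice, and budget below are illustrative assumptions only.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

# Candidate "arms": a discretized 1-D input space whose correlation is
# modeled by the GP kernel rather than treating each arm independently.
X_candidates = np.linspace(0.0, 1.0, 200).reshape(-1, 1)

def f(x):
    # Hypothetical noisy objective standing in for, e.g., a sensor reading.
    return np.sin(6.0 * x).ravel() + 0.1 * np.random.randn(len(x))

rng = np.random.default_rng(0)
X_obs = rng.uniform(0.0, 1.0, size=(3, 1))
y_obs = f(X_obs)

budget = 20  # fixed number of additional function evaluations
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=1e-2, normalize_y=True)

for _ in range(budget):
    gp.fit(X_obs, y_obs)
    mu, sigma = gp.predict(X_candidates, return_std=True)
    best = y_obs.max()
    # Expected improvement over the incumbent, computed from the GP posterior.
    z = (mu - best) / np.maximum(sigma, 1e-9)
    ei = (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)
    x_next = X_candidates[np.argmax(ei)].reshape(1, -1)
    X_obs = np.vstack([X_obs, x_next])
    y_obs = np.append(y_obs, f(x_next))

print("best observed value:", y_obs.max())
```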
Details
Institution | University of Oxford
Number | arXiv:1303.6746v2
Year | 2013