ePrints@IISc

Bayesian optimization under heavy-tailed payoffs

Chowdhury, SR and Gopalan, A (2019) Bayesian optimization under heavy-tailed payoffs. In: 33rd Annual Conference on Neural Information Processing Systems, NeurIPS 2019, 8-14 December 2019, Vancouver, Canada.

PDF: adv_neu_inf_pro_sys_32_2019.pdf - Published Version (567kB; restricted to registered users)
Official URL: https://papers.nips.cc/paper/2019


We consider black-box optimization of an unknown function in the nonparametric Gaussian process setting when the noise in the observed function values can be heavy-tailed. This is in contrast to the existing literature, which typically assumes sub-Gaussian noise distributions for queries. Under the assumption that the unknown function belongs to the Reproducing Kernel Hilbert Space (RKHS) induced by a kernel, we first show that an adaptation of the well-known GP-UCB algorithm with reward truncation enjoys sublinear Õ(T^((2+α)/(2(1+α)))) regret even with only the (1+α)-th moments, α ∈ (0, 1], of the reward distribution being bounded (Õ hides logarithmic factors). However, for the common squared exponential (SE) and Matérn kernels, this is seen to be significantly larger than a fundamental Ω(T^(1/(1+α))) lower bound on regret. We resolve this gap by developing novel Bayesian optimization algorithms, based on kernel approximation techniques, with regret bounds matching the lower bound in order for the SE kernel. We numerically benchmark the algorithms on environments based on both synthetic models and real-world data sets. © 2019 Neural information processing systems foundation. All rights reserved.
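The truncation idea described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' exact algorithm: a GP posterior over the SE kernel is maintained as usual, but each observed payoff is clipped at a level b_t that grows with t, so occasional heavy-tailed outliers cannot dominate the posterior mean. The function names, parameter values, and the specific truncation schedule below are all illustrative assumptions.

```python
import numpy as np

def se_kernel(X, Y, lengthscale=1.0):
    # Squared-exponential (SE) kernel between two sets of points.
    d = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d / (2 * lengthscale ** 2))

def truncated_gpucb(f_noisy, domain, T, alpha=1.0, lam=1.0, beta=2.0):
    """GP-UCB with reward truncation (illustrative sketch).

    Observed payoffs are clipped at a threshold b_t that grows with t,
    so heavy-tailed noise (only (1+alpha)-th moments bounded) cannot
    swamp the GP posterior mean. The schedule for b_t below is a
    plausible choice for illustration, not the paper's exact one.
    """
    X, y = [], []
    for t in range(1, T + 1):
        if not X:
            # No data yet: query an arbitrary point.
            x = domain[np.random.randint(len(domain))]
        else:
            Xa = np.array(X)
            K = se_kernel(Xa, Xa) + lam * np.eye(len(X))
            Kinv = np.linalg.inv(K)
            ks = se_kernel(domain, Xa)
            mu = ks @ Kinv @ np.array(y)
            # Posterior variance diag(k(x,x) - ks Kinv ks^T), clipped at 0.
            var = np.clip(1.0 - np.einsum('ij,jk,ik->i', ks, Kinv, ks), 0.0, None)
            # Upper confidence bound acquisition.
            x = domain[np.argmax(mu + beta * np.sqrt(var))]
        r = f_noisy(x)
        b_t = t ** (1.0 / (2 * (1 + alpha)))  # growing truncation level
        y.append(float(np.clip(r, -b_t, b_t)))
        X.append(x)
    return np.array(X), np.array(y)
```

Clipping introduces a bias that shrinks as b_t grows, while keeping the variance of each (truncated) observation finite; the schedule trades these two effects off against each other.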

Item Type: Conference Paper
Publication: Advances in Neural Information Processing Systems
Publisher: Neural information processing systems foundation
Additional Information: The copyright of this article belongs to Neural information processing systems foundation
Keywords: Gaussian noise (electronic); Optimization; Bayesian optimization; Bayesian optimization algorithms; Black-box optimization; Function values; Gaussian processes; Kernel approximation; Reproducing kernel Hilbert spaces; Synthetic models; Approximation algorithms
Department/Centre: Division of Electrical Sciences > Electrical Communication Engineering
Date Deposited: 22 Sep 2020 07:40
Last Modified: 28 Aug 2022 10:21
URI: https://eprints.iisc.ac.in/id/eprint/66564
