Bandits with budgets: Regret lower bounds and optimal algorithms

Richard Combes, Chong Jiang, Rayadurgam Srikant

Research output: Contribution to journal › Conference article › peer-review

Abstract

We investigate multi-armed bandits with budgets, a natural model for ad-display optimization encountered in search engines. We provide asymptotic regret lower bounds satisfied by any algorithm, and propose algorithms which match those lower bounds. We consider different types of budgets: scenarios where the advertiser has a fixed budget over a time horizon, and scenarios where the amount of money that is available to spend is incremented in each time slot. Further, we consider two different pricing models, one in which an advertiser is charged for each time her ad is shown (i.e., for each impression) and one in which the advertiser is charged only if a user clicks on the ad. For all of these cases, we show that it is possible to achieve O(log(T)) regret. For both the cost-per-impression and cost-per-click models, with a fixed budget, we provide regret lower bounds that apply to any uniformly good algorithm. Further, we show that B-KL-UCB, a natural variant of KL-UCB, is asymptotically optimal for these cases. Numerical experiments (based on a real-world data set) further suggest that B-KL-UCB also has the same or better finite-time performance when compared to various previously proposed (UCB-like) algorithms, which is important when applying such algorithms to a real-world problem.
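The abstract above describes an index policy built on KL-UCB for bandits with budgets. As a rough illustration only, the sketch below combines the standard Bernoulli KL-UCB index with a naive budget-feasibility check under a cost-per-impression model; it is not the paper's B-KL-UCB, and the function names, the bisection depth, and the greedy affordability rule are assumptions made for the example.

```python
import math

def bernoulli_kl(p, q):
    """KL divergence between Bernoulli(p) and Bernoulli(q)."""
    eps = 1e-12
    p = min(max(p, eps), 1 - eps)
    q = min(max(q, eps), 1 - eps)
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def kl_ucb_index(mean, pulls, t):
    """Largest q >= mean with pulls * kl(mean, q) <= log(t), found by bisection."""
    if pulls == 0:
        return 1.0                     # unexplored arms get the optimistic index
    bound = math.log(max(t, 2)) / pulls
    lo, hi = mean, 1.0
    for _ in range(30):                # 30 bisection steps are ample for this illustration
        mid = (lo + hi) / 2
        if bernoulli_kl(mean, mid) <= bound:
            lo = mid
        else:
            hi = mid
    return lo

def run_budgeted_kl_ucb(arms, budget, horizon, rng):
    """arms: list of (click_probability, cost_per_impression); budget: total spend allowed."""
    k = len(arms)
    pulls = [0] * k
    means = [0.0] * k
    spent = 0.0
    reward = 0.0
    for t in range(1, horizon + 1):
        # only arms whose per-impression cost still fits in the remaining budget
        affordable = [a for a in range(k) if spent + arms[a][1] <= budget]
        if not affordable:
            break                      # budget exhausted before the horizon
        a = max(affordable, key=lambda i: kl_ucb_index(means[i], pulls[i], t))
        x = 1.0 if rng.random() < arms[a][0] else 0.0
        pulls[a] += 1
        means[a] += (x - means[a]) / pulls[a]
        spent += arms[a][1]            # cost-per-impression charging
        reward += x
    return reward, spent

if __name__ == "__main__":
    import random
    # hypothetical example: two arms (click probability, cost), fixed budget of 200
    print(run_budgeted_kl_ucb([(0.3, 1.0), (0.5, 2.0)], budget=200.0, horizon=1000, rng=random.Random(0)))
```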

Original language: English (US)
Pages (from-to): 245-257
Number of pages: 13
Journal: Performance Evaluation Review
Volume: 43
Issue number: 1
DOIs
State: Published - Jun 24 2015
Event: ACM SIGMETRICS International Conference on Measurement and Modeling of Computer Systems, SIGMETRICS 2015 - Portland, United States
Duration: Jun 15 2015 - Jun 19 2015

Keywords

  • Ad-display optimization
  • Budgets
  • KL-UCB
  • Learning
  • Multi-armed bandits
  • Search engines
  • UCB

ASJC Scopus subject areas

  • Software
  • Hardware and Architecture
  • Computer Networks and Communications
