QUT ePrints

Closing the gap between bandit and full-information online optimization: high-probability regret bound

Rakhlin, Alexander, Tewari, Ambuj, & Bartlett, Peter L. (2007) Closing the gap between bandit and full-information online optimization: high-probability regret bound. Technical Report UCB/EECS-2007-109, University of California, Berkeley, California (USA).


Abstract

We demonstrate a modification of the algorithm of Dani et al. for the online linear optimization problem in the bandit setting, which allows us to achieve an O(\sqrt{T \ln T}) regret bound that holds with high probability against an adaptive adversary, as opposed to the in-expectation result against an oblivious adversary of Dani et al. We obtain the same dependence on the dimension as that exhibited by Dani et al. The results of this paper rest firmly on those of Dani et al. and the remarkable technique of Auer et al. for obtaining high-probability bounds via optimistic estimates. This paper answers an open question: it eliminates the gap between the high-probability bounds obtained in the full-information and bandit settings.
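
For orientation, a minimal sketch of the quantity being bounded, assuming the standard online linear optimization setup (decision set K, adversarially chosen loss vectors f_1, ..., f_T, learner's plays x_1, ..., x_T); this notation is supplied here and may differ from the report's:

\mathrm{Regret}_T = \sum_{t=1}^{T} f_t^\top x_t - \min_{x \in K} \sum_{t=1}^{T} f_t^\top x

In the bandit setting the learner observes only the scalar loss f_t^\top x_t at each round, and the report's claim is that \mathrm{Regret}_T = O(\sqrt{T \ln T}) holds with high probability against an adaptive adversary, with the same dimension dependence as Dani et al.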


ID Code: 44021
Item Type: Report
Keywords: OAVJ
Divisions: Past > QUT Faculties & Divisions > Faculty of Science and Technology
Past > Schools > Mathematical Sciences
Copyright Owner: Copyright 2007 (please consult the authors)
Deposited On: 18 Aug 2011 09:54
Last Modified: 18 Aug 2011 09:54
