Almost Optimal Exploration in Multi-Armed Bandits
Zohar Karnin, Tomer Koren, and Oren Somekh, 2013
Abstract
(unavailable)
BibTeX Entry
@InProceedings{Karnin+KS:2013,
  author    = "Karnin, Zohar and Koren, Tomer and Somekh, Oren",
  title     = "Almost Optimal Exploration in Multi-Armed Bandits",
  booktitle = "Proceedings of the Thirtieth International Conference on Machine Learning (ICML 2013)",
  year      = "2013",
  volume    = "28",
  series    = "JMLR Workshop and Conference Proceedings",
  publisher = "JMLR",
  pages     = "1238--1246",
  url       = "http://jmlr.org/proceedings/papers/v28/karnin13.pdf",
  bib2html_rescat = "Bandits",
}