Bandits and Experts in Metric Spaces

Journal of the ACM (JACM), pp. 1-77, 2019.

Cited by: 405 | DOI: https://doi.org/10.1145/3299873
Other links: dblp.uni-trier.de | academic.microsoft.com | dl.acm.org | arxiv.org

Abstract:

In a multi-armed bandit problem, an online algorithm chooses from a set of strategies in a sequence of trials to maximize the total payoff of the chosen strategies. While the performance of bandit ...
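To make the setting in the abstract concrete, here is a minimal sketch of a bandit loop for the finite-armed case, using the standard UCB1 index rule of Auer et al. This is an illustration of the general problem only, not the metric-space algorithm studied in the paper; the names `pull`, `n_arms`, `horizon`, and the Bernoulli arms in the usage example are hypothetical.

```python
import math
import random

def ucb1(pull, n_arms, horizon):
    """Run UCB1 for `horizon` trials; pull(a) returns a payoff in [0, 1]."""
    counts = [0] * n_arms   # times each arm has been chosen
    sums = [0.0] * n_arms   # cumulative payoff per arm
    total = 0.0
    for t in range(1, horizon + 1):
        if t <= n_arms:
            arm = t - 1      # play every arm once first
        else:
            # choose the arm maximizing empirical mean + confidence radius
            arm = max(range(n_arms),
                      key=lambda a: sums[a] / counts[a]
                                    + math.sqrt(2 * math.log(t) / counts[a]))
        reward = pull(arm)
        counts[arm] += 1
        sums[arm] += reward
        total += reward
    return total

# Hypothetical usage: three Bernoulli arms with unknown means
means = [0.2, 0.5, 0.8]
payoff = ucb1(lambda a: float(random.random() < means[a]), n_arms=3, horizon=10000)
print(f"total payoff over 10000 trials: {payoff:.0f}")
```

The confidence term shrinks as an arm is sampled more often, which is what balances exploration against exploitation; the paper's contribution concerns the much larger (metric) strategy sets that this finite-armed baseline does not handle.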
