Bandits and Experts in Metric Spaces
Journal of the ACM (JACM), pp. 1-77, 2019.
Abstract:
In a multi-armed bandit problem, an online algorithm chooses from a set of strategies in a sequence of trials to maximize the total payoff of the chosen strategies. While the performance of bandit ...
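To make the setting concrete, here is a minimal sketch of the classic epsilon-greedy policy on Bernoulli arms. This is an illustration of the basic finite-armed bandit problem described above, not the algorithm proposed in the paper (which addresses large, metric strategy sets); all names and parameters below are illustrative.

```python
import random

def epsilon_greedy_bandit(true_means, n_trials=10_000, epsilon=0.1, seed=0):
    """Play an epsilon-greedy policy against Bernoulli arms.

    With probability epsilon, explore a uniformly random arm;
    otherwise exploit the arm with the highest running mean estimate.
    Returns the per-arm payoff estimates and the total payoff collected.
    """
    rng = random.Random(seed)
    k = len(true_means)
    counts = [0] * k        # number of pulls of each arm
    estimates = [0.0] * k   # running mean payoff of each arm
    total = 0.0
    for _ in range(n_trials):
        if rng.random() < epsilon:
            arm = rng.randrange(k)                               # explore
        else:
            arm = max(range(k), key=lambda a: estimates[a])      # exploit
        reward = 1.0 if rng.random() < true_means[arm] else 0.0  # Bernoulli payoff
        counts[arm] += 1
        # incremental update of the running mean for the pulled arm
        estimates[arm] += (reward - estimates[arm]) / counts[arm]
        total += reward
    return estimates, total
```

With three arms of means 0.2, 0.5, and 0.8, the policy concentrates its pulls on the best arm and its average payoff approaches 0.8; the paper studies what happens when the set of arms is too large (e.g. a metric space) for this kind of per-arm bookkeeping.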