Incentivizing exploration

Proceedings of the Fifteenth ACM Conference on Economics and Computation, pp. 5–22, 2014.

Keywords:
crowdsourcing, economics, exploration, incentives, multi-armed bandit problems

Abstract:

We study a Bayesian multi-armed bandit (MAB) setting in which a principal seeks to maximize the sum of expected time-discounted rewards obtained by pulling arms, when the arms are actually pulled by selfish and myopic individuals. Since such individuals pull the arm with the highest expected posterior reward (i.e., they always exploit and never explore), […]
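The myopic behavior described in the abstract, and the principal's lever of offering payments on arms, can be sketched as follows. This is an illustrative model only, not the paper's mechanism: the `BetaArm` class, the Beta-Bernoulli posterior, and the `payments` parameter are assumptions chosen to make the idea concrete.

```python
class BetaArm:
    """Bernoulli arm whose success probability has a Beta posterior."""

    def __init__(self, a=1.0, b=1.0):
        self.a, self.b = a, b  # Beta(a, b) prior/posterior parameters

    def mean(self):
        """Expected posterior reward of pulling this arm."""
        return self.a / (self.a + self.b)

    def update(self, reward):
        """Bayesian update after observing a 0/1 reward."""
        self.a += reward
        self.b += 1 - reward


def myopic_choice(arms, payments=None):
    """A selfish, myopic agent picks the arm maximizing expected
    posterior reward plus any payment the principal attaches to it.
    With no payments, the agent always exploits and never explores."""
    if payments is None:
        payments = [0.0] * len(arms)
    scores = [arm.mean() + pay for arm, pay in zip(arms, payments)]
    return max(range(len(arms)), key=lambda i: scores[i])
```

For example, with two arms whose posterior means are 0.75 and 0.5, the agent pulls the first; a payment of 0.3 on the second arm flips the choice, which is the kind of incentive the principal can use to induce exploration.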
