The Convergence Rates of Blockchain Mining Games: A Markovian Approach

arXiv (2021)

Abstract
Understanding the strategic behavior of miners in a blockchain is of great importance for its proper operation. A common model for mining games considers an infinite time horizon, with players optimizing asymptotic average objectives. Implicitly, this assumes that the asymptotic behavior is realized at human-scale times; otherwise, current models would be invalidated. We study the mining game using Markov Decision Processes. Our approach allows us to describe the asymptotic behavior of the game in terms of the stationary distribution of the induced Markov chain. We focus on a model with two players under immediate release, considering two different objectives: the (asymptotic) average reward per turn and the (asymptotic) percentage of obtained blocks. Using tools from Markov chain analysis, we show the existence of a strategy with slow mixing times, exponential in the policy parameters. This result emphasizes the need to understand convergence rates in mining games in order to validate the standard models. Towards this end, we provide upper bounds on the mixing time for certain meaningful classes of strategies. This result yields criteria for establishing that long-term averaged functions are coherent as payoff functions. Moreover, by studying hitting times, we provide a criterion to validate the common simplification of considering finite-state models. For both objective functions considered, we provide explicit formulae depending on the stationary distribution of the underlying Markov chain. In particular, this shows that the two objectives are not equivalent. Finally, we perform a market share case study in a particular regime of the game. More precisely, we show that a strategic player with a sufficiently large processing power can impose negative revenue on honest players.
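To make the role of the stationary distribution concrete, here is a minimal sketch (not the paper's construction): for a finite, irreducible Markov chain with transition matrix P and per-state reward vector r, the asymptotic average reward per turn equals the stationary distribution π weighted by r. The two-state matrix P and reward r below are illustrative placeholders, not the mining game's actual state space.

```python
import numpy as np

# Illustrative only: for a finite, irreducible chain with transition matrix P,
# the long-run average reward per turn is pi @ r, where pi solves pi P = pi.

def stationary_distribution(P: np.ndarray) -> np.ndarray:
    """Return pi with pi P = pi and sum(pi) = 1, via the eigenvector of P^T for eigenvalue 1."""
    eigvals, eigvecs = np.linalg.eig(P.T)
    idx = np.argmin(np.abs(eigvals - 1.0))
    pi = np.real(eigvecs[:, idx])
    return pi / pi.sum()

# Toy two-state chain standing in for the states induced by a mining policy.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
r = np.array([0.0, 1.0])   # hypothetical per-state reward

pi = stationary_distribution(P)
print("stationary distribution:", pi)          # [5/6, 1/6]
print("average reward per turn:", pi @ r)      # 1/6
```

The same distribution π also gives the asymptotic fraction of time spent in each state, which is how a percentage-of-blocks style objective can differ from the average-reward objective when rewards are not uniform across states.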
Keywords
blockchain mining games, Markovian approach, convergence rates