Convergence Analysis of Block Coordinate Algorithms with Determinantal Sampling

International Conference on Artificial Intelligence and Statistics (AISTATS), Vol. 108 (2020)

Abstract
We analyze the convergence rate of the randomized Newton-like method introduced by Qu et al. (2016) for smooth and strongly convex objectives, which uses random coordinate blocks of a Hessian over-approximation matrix M instead of the true Hessian. The convergence analysis of the algorithm is challenging because of its complex dependence on the structure of M. However, we show that when the coordinate blocks are sampled with probability proportional to their determinant, the convergence rate depends solely on the eigenvalue distribution of M and has an analytically tractable form. To do so, we derive a fundamental new expectation formula for determinantal point processes. We show that determinantal sampling allows us to reason about the optimal subset size of blocks in terms of the spectrum of M. Additionally, we provide a numerical evaluation of our analysis, demonstrating cases where determinantal sampling is superior to or on par with uniform sampling.
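To make the two ingredients the abstract describes concrete, below is a minimal Python sketch: sampling a coordinate block S with probability proportional to det(M[S, S]) (a k-DPP over the principal submatrices of M), followed by a Newton-like block update that solves only the k-by-k system M[S, S] d = grad[S]. The function names, the brute-force exact sampler (viable only for small n), the unit step size, and the least-squares example are my own illustrative assumptions, not the paper's implementation.

```python
import itertools
import numpy as np

def sample_block_kdpp(M, k, rng):
    """Sample a size-k index set S with P(S) proportional to det(M[S, S]).

    Exact sampling by enumerating all k-subsets; only viable for small n,
    but it makes the determinantal distribution explicit.
    """
    n = M.shape[0]
    subsets = list(itertools.combinations(range(n), k))
    weights = np.array([np.linalg.det(M[np.ix_(S, S)]) for S in subsets])
    probs = weights / weights.sum()
    return np.array(subsets[rng.choice(len(subsets), p=probs)])

def newton_like_step(x, grad, M, S):
    """One block step: solve M[S, S] d = grad[S] and update only
    the coordinates in S (hypothetical unit step size)."""
    d = np.linalg.solve(M[np.ix_(S, S)], grad[S])
    x = x.copy()
    x[S] -= d
    return x

# Toy example: f(x) = 0.5 * ||A x - b||^2, whose Hessian A^T A is
# itself a valid over-approximation M.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 8))
b = rng.standard_normal(20)
M = A.T @ A
x = np.zeros(8)
for _ in range(200):
    S = sample_block_kdpp(M, k=3, rng=rng)
    x = newton_like_step(x, A.T @ (A @ x - b), M, S)
```

Efficient DPP samplers avoid the combinatorial enumeration used here; the sketch only aims to show what "sampled with probability proportional to their determinant" means for the block selection.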
Keywords
block coordinate algorithms,determinantal sampling