Finding low-rank solutions to smooth convex problems via the Burer-Monteiro approach.

2016 54TH ANNUAL ALLERTON CONFERENCE ON COMMUNICATION, CONTROL, AND COMPUTING (ALLERTON)(2016)

Abstract
A rank-r matrix X ∈ R^{m×n} can be written as a product UV^T, where U ∈ R^{m×r} and V ∈ R^{n×r}. One can exploit this observation in optimization: e.g., consider minimizing a convex function f(X) over rank-r matrices, where the set of rank-r matrices is modeled via the factorization into the U and V variables. This heuristic has been widely used for problem instances whose solution is (approximately) low-rank. Although the parameterization reduces the number of variables and is more efficient with respect to computation and memory (of particular interest is the case r ≪ min{m, n}), it comes at a cost: f(UV^T) becomes non-convex in U and V. In this paper, we study this parameterization for optimizing a generic smooth convex f with Lipschitz continuous gradient, and focus on first-order, gradient descent algorithmic solutions. We propose the Bi-Factored Gradient Descent (BFGD) algorithm, an efficient first-order method that operates on the U, V factors. We show that when f is smooth and BFGD is initialized properly, it converges locally at a sublinear rate to a global optimum. As a test case, we consider the 1-bit matrix completion problem: we compare BFGD with state-of-the-art approaches and show that it attains at least competitive test error on real-dataset experiments, while running faster than the other algorithms. We conclude with some remarks and open questions for further investigation.
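To make the factored-gradient idea concrete, here is a minimal sketch of plain gradient descent on the factors U and V. This is an illustrative simplification, not the paper's exact BFGD (which additionally uses a balancing regularizer on the factors and a specific initialization); the objective f(X) = 0.5‖X − M‖_F² and all variable names are assumptions chosen for the toy example.

```python
import numpy as np

def factored_gd(grad_f, U, V, step=0.05, iters=2000):
    """Gradient descent on g(U, V) = f(U V^T) for a smooth convex f.

    grad_f(X) returns the gradient of f at the matrix X.
    The chain rule gives dg/dU = grad_f(U V^T) @ V and
    dg/dV = grad_f(U V^T)^T @ U.
    Simplified sketch; the paper's BFGD also includes a balancing
    term and a careful initialization, omitted here.
    """
    for _ in range(iters):
        G = grad_f(U @ V.T)                      # gradient of f at X = U V^T
        U, V = U - step * G @ V, V - step * G.T @ U
    return U, V

# Toy instance (an assumption for illustration):
# f(X) = 0.5 * ||X - M||_F^2, so grad_f(X) = X - M, with a rank-2 target M.
rng = np.random.default_rng(0)
M = rng.standard_normal((6, 2)) @ rng.standard_normal((2, 5))
U0 = 0.1 * rng.standard_normal((6, 2))           # small random initialization
V0 = 0.1 * rng.standard_normal((5, 2))
U, V = factored_gd(lambda X: X - M, U0, V0)
print(np.linalg.norm(U @ V.T - M))               # residual of the rank-2 fit
```

The update touches only the m·r + n·r factor entries, which is the source of the memory savings the abstract mentions when r ≪ min{m, n}; the price is that g(U, V) is non-convex even though f is convex.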
Keywords
low-rank solutions, smooth convex problems, Burer-Monteiro approach, rank-r matrix, optimization, minimization, convex function, bi-factored gradient descent algorithm, BFGD algorithm