Scoring positive semidefinite cutting planes for quadratic optimization via trained neural networks

Semantic Scholar (2019)

Abstract
Semidefinite programming relaxations complement polyhedral relaxations for quadratic optimization, but global optimization solvers built on polyhedral relaxations cannot fully exploit this advantage. This paper develops linear outer-approximations of semidefinite constraints that can be effectively integrated into global solvers. In contrast to previous work, our proposed cuts are (i) sparser with respect to the number of nonzeros per row and (ii) explicitly selected to improve the objective. We select which cuts to generate using objective structure and explore engineering trade-offs for sparsity patterns, e.g., cut dimensionality and chordal extensions. A neural network estimator is key to our cut selection strategy: ranking each cut by its objective improvement involves solving a semidefinite optimization problem, an expensive proposition at every Branch&Cut node. The neural network estimator, trained before any instance is solved, takes this computation offline by predicting the objective improvement of any cut. Most of this paper discusses quadratic programming with box constraints as a test bed for the relevant engineering trade-offs. We also show how the engineering decisions extend immediately to quadratically constrained instances, particularly those with rich quadratic objective structure.
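As a rough illustration only (not the paper's implementation), the sketch below shows the standard eigenvector-based linear outer-approximation cut on a sparse principal submatrix, together with a placeholder scoring hook where a trained neural-network estimator would rank candidate cuts by predicted objective improvement. The candidate index sets, the hand-picked features, and `dummy_predictor` are assumptions made for the example, not details taken from the paper.

```python
import numpy as np

def eigenvalue_cut(X_hat, idx, tol=1e-8):
    """Build a sparse linear cut v^T X[idx, idx] v >= 0 from the most negative
    eigenvector of the principal submatrix X_hat[idx, idx].
    Returns (idx, v), or None if the submatrix is already (numerically) PSD."""
    sub = X_hat[np.ix_(idx, idx)]
    w, V = np.linalg.eigh(sub)          # eigenvalues in ascending order
    if w[0] >= -tol:
        return None
    v = V[:, 0]                         # eigenvector of the most negative eigenvalue
    return idx, v                       # cut: sum_{i,j} v_i v_j X[idx[i], idx[j]] >= 0

def score_candidates(X_hat, candidates, predictor):
    """Rank candidate index sets by a learned estimate of objective improvement.
    `predictor` is any callable mapping a feature vector to a score; it stands in
    for a neural-network estimator trained offline."""
    scored = []
    for idx in candidates:
        w = np.linalg.eigvalsh(X_hat[np.ix_(idx, idx)])
        feats = np.array([w[0], w.sum(), len(idx)])   # illustrative features only
        scored.append((predictor(feats), idx))
    return sorted(scored, key=lambda t: -t[0])

# Toy usage: a random non-PSD point, a few small candidate supports, and a dummy
# "predictor" that simply favors the most violated submatrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
X_hat = (A + A.T) / 2
candidates = [[0, 1], [1, 2, 3], [0, 3, 4]]
dummy_predictor = lambda f: -f[0]
for score, idx in score_candidates(X_hat, candidates, dummy_predictor)[:2]:
    cut = eigenvalue_cut(X_hat, idx)
    if cut is not None:
        print("add cut on rows", idx, "with vector", np.round(cut[1], 3))
```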