Balancing Vectors in Any Norm

2018 IEEE 59th Annual Symposium on Foundations of Computer Science (FOCS)

Abstract
In the vector balancing problem, we are given symmetric convex bodies C and K in R^n, and our goal is to determine the minimum number β ≥ 0, known as the vector balancing constant from C to K, such that for any sequence of vectors in C there always exists a signed combination of them lying inside β K. Many fundamental results in discrepancy theory, such as the Beck-Fiala theorem (Discrete Appl. Math. '81), Spencer's "six standard deviations suffice" theorem (Trans. Amer. Math. Soc. '85) and Banaszczyk's vector balancing theorem (Random Structures & Algorithms '98), correspond to bounds on vector balancing constants. These theorems have inspired much research in recent years within theoretical computer science. In this work, we show that all vector balancing constants admit "good" approximate characterizations, with approximation factors depending only polylogarithmically on the dimension n. First, we show that a volumetric lower bound due to Banaszczyk is tight within an O(log n) factor. Our proof is algorithmic, and we show that Rothvoss's partial coloring algorithm (FOCS '14) can be analyzed to obtain these guarantees. Second, we present a novel convex program which encodes the "best possible way" to apply Banaszczyk's vector balancing theorem for bounding vector balancing constants from above, and show that it is tight within an O(log^2.5 n) factor. This also directly yields a corresponding polynomial-time approximation algorithm both for vector balancing constants and for the hereditary discrepancy of any sequence of vectors with respect to an arbitrary norm.
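For concreteness, the two quantities discussed in the abstract admit the following standard formalization; the notation below (u_i, ε, S) is a sketch assumed for illustration rather than text quoted from the paper. The vector balancing constant from C to K can be written as
\[
  \mathrm{vb}(C, K) \;=\; \sup_{m \ge 1} \; \sup_{u_1, \dots, u_m \in C} \; \min_{\varepsilon \in \{-1, +1\}^m} \Bigl\| \sum_{i=1}^{m} \varepsilon_i u_i \Bigr\|_K ,
\]
where \|x\|_K denotes the norm whose unit ball is the symmetric convex body K, and the hereditary discrepancy of a sequence u_1, \dots, u_m with respect to \|\cdot\|_K is
\[
  \mathrm{herdisc}_K(u_1, \dots, u_m) \;=\; \max_{S \subseteq [m]} \; \min_{\varepsilon \in \{-1, +1\}^S} \Bigl\| \sum_{i \in S} \varepsilon_i u_i \Bigr\|_K .
\]
In this notation, the paper's results give polylogarithmic-factor approximations to vb(C, K) and to herdisc_K.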
Keywords
Discrepancy, Convex Geometry, Gaussian measure, M-ellipsoid, K-convexity