Fast metric embedding into the Hamming cube

SIAM Journal on Computing (2024)

Abstract
We consider the problem of embedding a subset of $\mathbb{R}^n$ into a low-dimensional Hamming cube in an almost isometric way. We construct a simple, data-oblivious, and computationally efficient map that achieves this task with high probability. We first apply a specific structured random matrix, which we call the double circulant matrix; this matrix requires only linear storage, and its matrix-vector multiplication can be performed in near-linear time. We then binarize each vector by comparing each of its entries to a random threshold, selected uniformly at random from a well-chosen interval. We estimate the number of bits required for this encoding scheme in terms of two natural geometric complexity parameters of the set: its Euclidean covering numbers and its localized Gaussian complexity. The estimate we derive turns out to be the best one can hope for, up to logarithmic terms. The key to the proof is a phenomenon of independent interest: we show that the double circulant matrix mimics the behavior of the Gaussian matrix in two important ways. First, it maps an arbitrary set in $\mathbb{R}^n$ into a set of well-spread vectors. Second, it yields a fast near-isometric embedding of any finite subset of $\ell_2^n$ into $\ell_1^m$. This embedding achieves the same dimension reduction as the Gaussian matrix in near-linear time, under a condition on the number of points to be embedded that is optimal up to logarithmic factors. This improves a well-known construction due to Ailon and Chazelle.
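The two-step pipeline described in the abstract (a fast structured random transform followed by entrywise binarization against uniform random thresholds) can be illustrated with a minimal Python sketch. This is not the paper's exact construction: the double circulant matrix is replaced here by a simplified, hypothetical stand-in built from two rounds of Rademacher sign flips and FFT-based circulant multiplication, and the scaling and threshold interval `[-lam, lam]` are placeholders; the precise definitions are those given in the paper.

```python
import numpy as np

def circulant_matvec(c, x):
    """Multiply the circulant matrix generated by c with x via FFT in O(n log n) time."""
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))

def make_hamming_embedding(n, m, lam, rng):
    """Draw all randomness once, so the returned map is data-oblivious.

    Stand-in for the double circulant matrix: two rounds of (Rademacher sign
    flip -> circulant matvec), keeping the first m coordinates; then binarize
    each retained entry against a threshold drawn uniformly from [-lam, lam].
    """
    signs = [rng.choice([-1.0, 1.0], size=n) for _ in range(2)]   # sign flips
    gens = [rng.standard_normal(n) for _ in range(2)]             # circulant generators
    thresholds = rng.uniform(-lam, lam, size=m)                   # random thresholds

    def embed(x):
        y = x
        for eps, c in zip(signs, gens):
            y = circulant_matvec(c, eps * y) / np.sqrt(n)
        return (y[:m] >= thresholds).astype(np.uint8)             # point in {0,1}^m

    return embed

# Usage: embed a small point set from R^n into the Hamming cube {0,1}^m
# with one shared map, so pairwise distances are comparable across points.
rng = np.random.default_rng(0)
n, m, lam = 1024, 256, 1.0
embed = make_hamming_embedding(n, m, lam, rng)
X = rng.standard_normal((5, n))
codes = np.stack([embed(x) for x in X])
print(codes.shape)  # (5, 256)
```

Because the transform is applied via FFT, the cost per point is near-linear in n, and only the sign patterns, generators, and thresholds need to be stored, which is linear storage.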
Keywords
dimension reduction,Johnson-Lindenstrauss embeddings,Hamming cube,circulant matrices,Gaussian width