Conditional Independence of 1D Gibbs States with Applications to Efficient Learning

arXiv (2024)

Abstract
We show that spin chains in thermal equilibrium have a correlation structure in which individual regions are strongly correlated only with their immediate vicinity. We quantify this with alternative notions of conditional mutual information, defined through the so-called Belavkin-Staszewski relative entropy. We prove that these measures decay super-exponentially, under the assumption that the spin-chain Hamiltonian is translation-invariant. Using a recovery map associated with these measures, we sequentially construct tensor-network approximations in terms of marginals of small (sub-logarithmic) size. As our main application, we show that classical representations of these states can be learned efficiently from local measurements with polynomial sample complexity. We also prove an approximate factorization condition for the purity of the entire Gibbs state, which implies that the purity can be efficiently estimated to small multiplicative error from a small number of local measurements. As a technical step of independent interest, we prove an upper bound on the decay of the Belavkin-Staszewski relative entropy under the application of a conditional expectation.
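For background, the Belavkin-Staszewski (BS) relative entropy underlying these measures has a standard definition, recorded in the LaTeX sketch below together with its known relation to the Umegaki relative entropy. The paper's specific BS-type conditional mutual information may combine such terms in its own way, so this is contextual background rather than the authors' exact definitions.

% Standard Belavkin-Staszewski relative entropy for full-rank
% density matrices \rho and \sigma:
\[
  D_{\mathrm{BS}}(\rho \,\|\, \sigma)
    = \operatorname{Tr}\!\bigl[\rho \log\bigl(\rho^{1/2} \sigma^{-1} \rho^{1/2}\bigr)\bigr].
\]
% It upper-bounds the Umegaki relative entropy:
\[
  D(\rho \,\|\, \sigma)
    = \operatorname{Tr}\bigl[\rho (\log\rho - \log\sigma)\bigr]
    \le D_{\mathrm{BS}}(\rho \,\|\, \sigma),
\]
% so a conditional mutual information built from D_{BS} is at least
% as strong a correlation measure as the standard one.

In this reading, the super-exponential decay of a BS-based conditional mutual information between a region and the complement of its neighborhood is precisely the quantitative form of the claim that each region is strongly correlated only with its immediate vicinity.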