Approximation Theory, Computing, and Deep Learning on the Wasserstein Space
arXiv (2023)
Abstract
Approximating functions in infinite-dimensional spaces from finite samples is
widely regarded as a formidable challenge. In this study, we address the
numerical approximation of Sobolev-smooth functions defined on probability
spaces, with particular focus on the Wasserstein distance function as a
relevant example. In contrast to the existing literature, which concentrates
on efficiently approximating pointwise evaluations, we chart a new course and
define functional approximants
by adopting three machine learning-based approaches: 1. Solving a finite number
of optimal transport problems and computing the corresponding Wasserstein
potentials. 2. Employing empirical risk minimization with Tikhonov
regularization in Wasserstein Sobolev spaces (a code sketch of this approach
follows below). 3. Addressing the problem through
the saddle point formulation that characterizes the weak form of the Tikhonov
functional's Euler-Lagrange equation. As a theoretical contribution, we furnish
explicit and quantitative bounds on generalization errors for each of these
solutions. In the proofs, we leverage the theory of metric Sobolev spaces,
combining it with techniques from optimal transport, variational calculus, and
large deviation bounds. In our numerical implementation, we use appropriately
designed neural networks as basis functions and train them with a variety of
methodologies, obtaining approximating functions that can be evaluated rapidly
after training. Consequently, at equal accuracy, our constructive solutions
evaluate several orders of magnitude faster than state-of-the-art methods.
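
To make the second approach concrete, the following is a minimal, illustrative
sketch of empirical risk minimization with Tikhonov regularization for learning
the Wasserstein distance to a fixed reference measure; it is not the paper's
implementation. It assumes discrete measures supported on a shared
one-dimensional grid, generates training targets with the POT library
(ot.emd2), and substitutes plain weight decay for the paper's Wasserstein
Sobolev penalty; the architecture and hyperparameters are arbitrary choices.

```python
# Illustrative sketch only: ERM with a Tikhonov-style regularizer for
# learning mu -> W_2(mu, mu_ref) on discrete measures over a 1-D grid.
# Requires: pip install pot torch
import numpy as np
import ot                      # POT: Python Optimal Transport
import torch
import torch.nn as nn

rng = np.random.default_rng(0)
n_grid, n_samples = 32, 500
grid = np.linspace(0.0, 1.0, n_grid).reshape(-1, 1)
M = ot.dist(grid, grid)        # squared-Euclidean ground cost matrix

def random_measure():
    w = rng.random(n_grid)
    return w / w.sum()         # normalize to a probability vector

mu_ref = random_measure()      # fixed reference measure
X = np.stack([random_measure() for _ in range(n_samples)])
# Training targets: exact W_2 distances from linear-programming OT.
# (ot.emd(x, mu_ref, M, log=True) would additionally return the dual
# Kantorovich potentials used in the first approach.)
y = np.sqrt([ot.emd2(x, mu_ref, M) for x in X])

model = nn.Sequential(nn.Linear(n_grid, 128), nn.ReLU(),
                      nn.Linear(128, 128), nn.ReLU(),
                      nn.Linear(128, 1))
# weight_decay stands in for the Tikhonov regularization term here.
opt = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)
Xt = torch.as_tensor(X, dtype=torch.float32)
yt = torch.as_tensor(y, dtype=torch.float32).unsqueeze(1)

for _ in range(500):           # full-batch empirical risk minimization
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(Xt), yt)
    loss.backward()
    opt.step()

# After training, each evaluation is a single forward pass, which is
# the source of the speedup over solving a fresh OT problem per query.
with torch.no_grad():
    w2_fast = model(Xt[:5]).squeeze(1).numpy()
```

Once trained, each evaluation of the surrogate is a single forward pass, which
is what yields the evaluation speedups described above; the accuracy of this
toy version naturally depends on the sampling and regularization choices.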