arXiv submission date: 2026-02-03
📄 Abstract - Riemannian Neural Optimal Transport

Computational optimal transport (OT) offers a principled framework for generative modeling. Neural OT methods, which use neural networks to learn an OT map (or potential) from data in an amortized way, can be evaluated out of sample after training, but existing approaches are tailored to Euclidean geometry. Extending neural OT to high-dimensional Riemannian manifolds remains an open challenge. In this paper, we prove that any method for OT on manifolds that produces discrete approximations of transport maps necessarily suffers from the curse of dimensionality: achieving a fixed accuracy requires a number of parameters that grows exponentially with the manifold dimension. Motivated by this limitation, we introduce Riemannian Neural OT (RNOT) maps, which are continuous neural-network parameterizations of OT maps on manifolds that avoid discretization and incorporate geometric structure by construction. Under mild regularity assumptions, we prove that RNOT maps approximate Riemannian OT maps with sub-exponential complexity in the dimension. Experiments on synthetic and real datasets demonstrate improved scalability and competitive performance relative to discretization-based baselines.

Top-level tags: theory machine learning model training
Detailed tags: optimal transport riemannian manifolds neural networks generative modeling curse of dimensionality

Riemannian Neural Optimal Transport


1️⃣ One-sentence summary

This paper proposes a method called RNOT, which uses neural networks to learn optimal transport maps directly on curved spaces (Riemannian manifolds). By avoiding discretization, it sidesteps the exponential parameter blow-up that discretization-based methods suffer in high dimensions, making it more efficient at transforming one data distribution into another.
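To make the core idea concrete, here is a minimal sketch (not the paper's actual RNOT architecture, whose details are not given in this summary) of how a neural map can respect manifold geometry "by construction": a small network produces a vector that is projected onto the tangent space at each point of the unit sphere, and the exponential map then guarantees the output stays on the manifold. The class name `TangentMLP` and all hyperparameters are illustrative assumptions.

```python
import numpy as np

def sphere_exp(x, v):
    """Exponential map on the unit sphere: follow the geodesic from x
    in tangent direction v for arc length ||v||."""
    norm = np.linalg.norm(v)
    if norm < 1e-12:
        return x
    return np.cos(norm) * x + np.sin(norm) * (v / norm)

class TangentMLP:
    """Tiny MLP with fixed random weights (stand-in for a trained network);
    its raw output is projected onto the tangent space T_x S^{d-1}."""
    def __init__(self, d, hidden=32, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(scale=0.5, size=(hidden, d))
        self.W2 = rng.normal(scale=0.5, size=(d, hidden))

    def __call__(self, x):
        h = np.tanh(self.W1 @ x)
        v = self.W2 @ h
        # Remove the component along x so v lies in the tangent space at x.
        return v - np.dot(v, x) * x

d = 3
net = TangentMLP(d)
x = np.array([1.0, 0.0, 0.0])          # a point on the sphere
y = sphere_exp(x, net(x))              # transported point
print(np.linalg.norm(y))               # stays on the sphere (norm == 1)
```

Because the map is a continuous function of the network weights rather than a lookup over discretized cells, its parameter count is set by the architecture, not by a grid whose size grows exponentially with dimension, which is the motivation the abstract gives for RNOT.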

Source: arXiv:2602.03566