
arXiv submission date: 2026-03-03

📄 Abstract - Infinite dimensional generative sensing

Deep generative models have become a standard for modeling priors in inverse problems, going beyond classical sparsity-based methods. However, existing theoretical guarantees are mostly confined to finite-dimensional vector spaces, creating a gap when the physical signals are modeled as functions in Hilbert spaces. This work presents a rigorous framework for generative compressed sensing in Hilbert spaces. We extend the notion of local coherence to the infinite-dimensional setting to derive optimal, resolution-independent sampling distributions. Via a generalization of the Restricted Isometry Property, we show that stable recovery holds when the number of measurements is proportional to the prior's intrinsic dimension (up to logarithmic factors), independent of the ambient dimension. Finally, numerical experiments on the Darcy flow equation validate our theoretical findings and demonstrate that in severely undersampled regimes, employing lower-resolution generators acts as an implicit regularizer, improving reconstruction stability.

Top-level tags: theory, machine learning, model training
Detailed tags: generative compressed sensing, inverse problems, infinite-dimensional, restricted isometry property, implicit regularization

Infinite dimensional generative sensing


1️⃣ One-sentence summary

This paper establishes a theoretical framework for signal reconstruction in infinite-dimensional Hilbert spaces, proving that when a deep generative model is used as the prior, the required number of measurements scales only with the model's intrinsic dimension rather than with the signal's high-dimensional ambient space, which enables stable reconstruction under severe undersampling.
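The core recovery principle behind generative compressed sensing can be illustrated with a minimal sketch: given few measurements y = A·G(z*), search over the generator's low-dimensional latent space for a code whose image matches the measurements. The "generator" below is a fixed random linear decoder standing in for a trained deep generative model, and the Gaussian measurement operator is a simplifying assumption; the paper's actual setting (Hilbert spaces, coherence-based sampling, Darcy flow) is far more general.

```python
import numpy as np

rng = np.random.default_rng(0)
latent_dim, ambient_dim, num_meas = 4, 256, 20  # num_meas << ambient_dim

# Stand-in "generator" G: R^latent_dim -> R^ambient_dim.
# A real prior would be a trained deep generative model.
G = rng.standard_normal((ambient_dim, latent_dim)) / np.sqrt(ambient_dim)

def generator(z):
    return G @ z

# Subsampled (Gaussian) measurement operator A.
A = rng.standard_normal((num_meas, ambient_dim)) / np.sqrt(num_meas)

# Ground-truth signal on the generator's range, and its measurements.
z_true = rng.standard_normal(latent_dim)
y = A @ generator(z_true)

# Recover by gradient descent on the latent code:
#   min_z  (1/2) || A G(z) - y ||^2
z = np.zeros(latent_dim)
step = 0.5
for _ in range(2000):
    residual = A @ generator(z) - y
    z -= step * (G.T @ (A.T @ residual))

x_hat = generator(z)
rel_err = np.linalg.norm(x_hat - generator(z_true)) / np.linalg.norm(generator(z_true))
```

Even with 20 measurements of a 256-dimensional signal, the relative error is tiny, because the number of measurements only needs to exceed the 4-dimensional intrinsic dimension of the prior (up to logarithmic factors), mirroring the paper's main scaling result.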

Source: arXiv 2603.03196