Generalization Properties of Score-matching Diffusion Models for Intrinsically Low-dimensional Data
1️⃣ One-sentence Summary
This paper proves that when score-matching diffusion models learn data distributions with intrinsic low-dimensional structure, the convergence rate of the generation error depends only on the intrinsic dimension of the data rather than the ambient dimension, thereby explaining in theory why these models effectively mitigate the curse of dimensionality and adapt to the true geometry of the data.
Despite the remarkable empirical success of score-based diffusion models, their statistical guarantees remain underdeveloped. Existing analyses often provide pessimistic convergence rates that do not reflect the intrinsic low-dimensional structure common in real data, such as that arising in natural images. In this work, we study the statistical convergence of score-based diffusion models for learning an unknown distribution $\mu$ from finitely many samples. Under mild regularity conditions on the forward diffusion process and the data distribution, we derive finite-sample error bounds on the learned generative distribution, measured in the Wasserstein-$p$ distance. Unlike prior results, our guarantees hold for all $p \ge 1$ and require only a finite-moment assumption on $\mu$, without compact-support, manifold, or smooth-density conditions. Specifically, given $n$ i.i.d.\ samples from $\mu$ with finite $q$-th moment and appropriately chosen network architectures, hyperparameters, and discretization schemes, we show that the expected Wasserstein-$p$ error between the learned distribution $\hat{\mu}$ and $\mu$ scales as $\mathbb{E}\, \mathbb{W}_p(\hat{\mu},\mu) = \widetilde{O}\!\left(n^{-1 / d^\ast_{p,q}(\mu)}\right),$ where $d^\ast_{p,q}(\mu)$ is the $(p,q)$-Wasserstein dimension of $\mu$. Our results demonstrate that diffusion models naturally adapt to the intrinsic geometry of data and mitigate the curse of dimensionality, since the convergence rate depends on $d^\ast_{p,q}(\mu)$ rather than the ambient dimension. Moreover, our theory conceptually bridges the analysis of diffusion models with that of GANs and the sharp minimax rates established in optimal transport. The proposed $(p,q)$-Wasserstein dimension also extends classical Wasserstein dimension notions to distributions with unbounded support, which may be of independent theoretical interest.
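As a concrete illustration of the Wasserstein-$p$ distance appearing in the bound above (not part of the paper; the function name and setup are illustrative), the sketch below computes $\mathbb{W}_p$ between two equal-size one-dimensional empirical samples. In one dimension the optimal transport plan matches order statistics, so the distance reduces to an $L^p$ norm of sorted differences; the loop then gives a rough empirical feel for how the error of an empirical measure shrinks with the sample size $n$, mirroring the $n^{-1/d^\ast_{p,q}(\mu)}$ rate in the abstract.

```python
import numpy as np

def empirical_wasserstein_p(x, y, p=1):
    """Wasserstein-p distance between two equal-size 1-D empirical samples.

    In one dimension the optimal coupling pairs order statistics,
    so W_p equals the L^p distance between the sorted samples.
    """
    xs, ys = np.sort(x), np.sort(y)
    return float(np.mean(np.abs(xs - ys) ** p) ** (1.0 / p))

# Sanity check: a pure shift is transported exactly, W_p(x, x + c) = |c|.
x = np.linspace(0.0, 1.0, 100)
print(empirical_wasserstein_p(x, x + 3.0))  # → 3.0

# Rough rate illustration: the W_1 error of an empirical measure of n
# Gaussian samples shrinks as n grows (classically about n^{-1/2} in one
# dimension). Here a large reference sample stands in for the true law.
rng = np.random.default_rng(0)
ref = np.sort(rng.normal(size=100_000))
for n in (100, 10_000):
    sample = rng.normal(size=n)
    quantiles = np.quantile(ref, (np.arange(n) + 0.5) / n)
    print(n, empirical_wasserstein_p(sample, quantiles))
```

The order-statistics shortcut is specific to one dimension; in higher dimensions computing $\mathbb{W}_p$ requires solving an optimal-transport problem, which is precisely where the intrinsic-dimension-dependent rates studied in the paper become relevant.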
Source: arXiv: 2603.03700