arXiv submission date: 2026-05-11
📄 Abstract - Kernel-Gradient Drifting Models

We propose kernel-gradient drifting, a one-step generative modeling framework that replaces the fixed Euclidean displacement direction in drifting models with directions induced by the kernel itself. Standard drifting is attractive because it enables fast, high-quality generation without distilling a large pretrained diffusion model, but its theory is currently understood mainly for Gaussian kernels, where the drift coincides with smoothed score matching and is identifiable. Our gradient-based reformulation exposes this score-based structure for general kernels: the resulting drift is the score difference between kernel-smoothed data and model distributions, yielding identifiability for characteristic kernels and a smoothed-KL descent interpretation of the drifting dynamics. Since kernel gradients are intrinsic tangent vectors, the same construction extends naturally to Riemannian manifolds and to discrete data via the Fisher-Rao geometry of the probability simplex. Across spherical geospatial data, promoter DNA and molecule generation, kernel-gradient drifting enables state-of-the-art one-step generation beyond the Euclidean setting without distillation.
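The abstract's core object, the drift as a score difference between kernel-smoothed data and model distributions, can be illustrated with a minimal Euclidean sketch. This is not the paper's implementation: it assumes a Gaussian kernel, estimates each smoothed score with a kernel density estimate, and uses a hypothetical bandwidth `h`.

```python
import numpy as np

def kde_score(x, samples, h):
    # Gradient of log Gaussian-KDE at x:
    # grad log p(x) = sum_i w_i (y_i - x) / h^2, with softmax weights w_i.
    d2 = np.sum((samples - x) ** 2, axis=1)
    logw = -d2 / (2 * h ** 2)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    return (w[:, None] * (samples - x)).sum(axis=0) / h ** 2

def drift(x, data_samples, model_samples, h=1.0):
    # Kernel-gradient drift direction at x: the score difference
    # between kernel-smoothed data and model distributions.
    return kde_score(x, data_samples, h) - kde_score(x, model_samples, h)

rng = np.random.default_rng(0)
data = rng.normal(2.0, 1.0, size=(500, 1))    # target distribution
model = rng.normal(0.0, 1.0, size=(500, 1))   # current model samples
x = np.zeros(1)
v = drift(x, data, model)
print(v)  # points from the model distribution toward the data
```

Following the smoothed-KL descent interpretation in the abstract, repeatedly moving model samples a small step along `drift` decreases the KL divergence between the kernel-smoothed model and data distributions; for a characteristic kernel the drift vanishes only when the two distributions match.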

Top-level tag: machine learning
Detailed tags: generative modeling · score matching · kernel methods · one-step generation · Riemannian manifolds

核梯度漂移模型 / Kernel-Gradient Drifting Models


1️⃣ One-Sentence Summary

This paper proposes kernel-gradient drifting, a new one-step generative modeling method that replaces the fixed Euclidean displacement of conventional drifting models with directions induced by the kernel itself. The method retains fast, high-quality generation while extending naturally to non-Euclidean settings such as spherical geospatial data, DNA sequences, and molecule generation, without distilling a large pretrained diffusion model.

Source: arXiv 2605.10727