arXiv submission date: 2026-01-28
📄 Abstract - Signal from Structure: Exploiting Submodular Upper Bounds in Generative Flow Networks

Generative Flow Networks (GFlowNets; GFNs) are a class of generative models that learn to sample compositional objects proportionally to their a priori unknown value, their reward. We focus on the case where the reward has a specified, actionable structure, namely that it is submodular. We show submodularity can be harnessed to retrieve upper bounds on the reward of compositional objects that have not yet been observed. We provide in-depth analyses of the probability of such bounds occurring, as well as how many unobserved compositional objects can be covered by a bound. Following the Optimism in the Face of Uncertainty principle, we then introduce SUBo-GFN, which uses the submodular upper bounds to train a GFN. We show that SUBo-GFN generates orders of magnitude more training data than classical GFNs for the same number of queries to the reward function. We demonstrate the effectiveness of SUBo-GFN in terms of distribution matching and high-quality candidate generation on synthetic and real-world submodular tasks.
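
For readers unfamiliar with the property the abstract leans on, the sketch below states the standard diminishing-returns definition of submodularity and the classic superset upper bound it implies; the paper may use a different or tighter bound, so this is illustrative only.

```latex
% Diminishing-returns definition of submodularity and the classic
% superset upper bound it implies (illustrative; the paper's exact
% bound may differ).
\begin{align*}
  &\text{Submodularity: } f(S \cup \{v\}) - f(S) \;\ge\; f(T \cup \{v\}) - f(T)
    \quad \forall\, S \subseteq T,\; v \notin T, \\
  &\text{Upper bound: } f(T) \;\le\; f(S) + \sum_{v \in T \setminus S}
    \bigl( f(S \cup \{v\}) - f(S) \bigr) \quad \forall\, S \subseteq T.
\end{align*}
```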

Top-level tags: theory, model, training, machine learning
Detailed tags: generative flow networks, submodular optimization, optimism in face of uncertainty, compositional objects, reward upper bounds

Signal from Structure: Exploiting Submodular Upper Bounds in Generative Flow Networks


1️⃣ One-sentence summary

This paper proposes a new method called SUBo-GFN, which exploits the submodular structure of the reward function to bound, in advance, the potential value of compositional objects that have not yet been observed. This lets the Generative Flow Network be trained far more efficiently, generating large amounts of high-quality training data while drastically reducing the number of queries to the true reward function.
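
As a rough illustration of why such bounds save reward queries, the minimal sketch below (the function names and the toy coverage reward are ours, not the paper's) computes the classic submodular superset bound from a handful of evaluations of a small observed set; the same evaluations can then serve as optimistic reward estimates for many unobserved supersets at once, which is the query-saving intuition described above.

```python
# Minimal sketch (illustrative only): bounding the reward of an unobserved
# superset T from evaluations of an observed subset S, assuming the reward
# f is monotone submodular. Names are hypothetical, not taken from the paper.

def upper_bound_superset(f, S, T):
    """Classic submodular bound: for S ⊆ T,
    f(T) <= f(S) + sum_{v in T \\ S} [f(S ∪ {v}) - f(S)].
    Uses |T \\ S| + 1 evaluations of f; the same evaluations of S and its
    one-element extensions bound every superset built from those elements."""
    S, T = set(S), set(T)
    assert S <= T, "bound only holds for subsets S of T"
    base = f(S)
    return base + sum(f(S | {v}) - base for v in T - S)

# Toy monotone submodular reward: how many ground elements a set of
# subsets covers.
coverage = {1: {"a", "b"}, 2: {"b", "c"}, 3: {"c", "d"}, 4: {"d", "e"}}
f = lambda S: len(set().union(*(coverage[i] for i in S))) if S else 0

S, T = {1, 2}, {1, 2, 3, 4}
print(f(T), "<=", upper_bound_superset(f, S, T))  # prints: 5 <= 6
```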

Source: arXiv: 2601.21061