📄 Abstract - On the Interplay of Priors and Overparametrization in Bayesian Neural Network Posteriors
Bayesian neural network (BNN) posteriors are often considered impractical for inference: symmetries fragment them, non-identifiabilities inflate their dimensionality, and weight-space priors are seen as meaningless. In this work, we study how overparametrization and priors jointly reshape BNN posteriors and derive implications that clarify their interplay. We show that redundancy introduces three key phenomena that fundamentally reshape the posterior geometry: balancedness, weight reallocation on equal-probability manifolds, and prior conformity. We validate our findings through extensive experiments with posterior sampling budgets that far exceed those of earlier work, and demonstrate how overparametrization induces structured, prior-aligned weight posteriors.
On the Interplay of Priors and Overparametrization in Bayesian Neural Network Posteriors
1️⃣ One-Sentence Summary
Through theoretical analysis and extensive experiments, this paper shows that in Bayesian neural networks, overparametrization (using far more parameters than necessary) acts jointly with the prior to reshape the structure of the posterior, inducing regular features such as balancedness, weight reallocation, and prior conformity, and thereby turning the weight posterior, long considered intractable, into something more structured and easier to understand.
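As a concrete illustration of the symmetries the abstract alludes to, here is a minimal NumPy sketch (not from the paper; the toy network, weights `W1`/`W2`, and scale `alpha` are hypothetical) of the ReLU rescaling non-identifiability: scaling a hidden unit's incoming weights by a positive factor and its outgoing weights by the inverse leaves the network function unchanged, which is why many distinct weight settings lie on manifolds of equal posterior probability.

```python
import numpy as np

# Toy ReLU network: f(x) = W2 @ relu(W1 @ x).
# For any alpha > 0, relu(alpha * z) = alpha * relu(z), so rescaling
# one hidden unit's incoming weights by alpha and its outgoing weights
# by 1/alpha is a symmetry of the function. Such non-identifiabilities
# are among the symmetries said to fragment BNN weight posteriors.

rng = np.random.default_rng(0)
W1 = rng.normal(size=(8, 4))   # hidden x input
W2 = rng.normal(size=(2, 8))   # output x hidden
x = rng.normal(size=(4,))

def f(W1, W2, x):
    return W2 @ np.maximum(W1 @ x, 0.0)

alpha, j = 3.7, 2              # arbitrary positive scale, hidden unit index
W1s, W2s = W1.copy(), W2.copy()
W1s[j, :] *= alpha             # rescale unit j's incoming weights
W2s[:, j] /= alpha             # compensate on its outgoing weights

# Different weights, identical function: an equal-probability direction
# in weight space under any likelihood.
assert np.allclose(f(W1, W2, x), f(W1s, W2s, x))
```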