arXiv submission date: 2026-04-06
📄 Abstract - Minimaxity and Admissibility of Bayesian Neural Networks

Bayesian neural networks (BNNs) offer a natural probabilistic formulation for inference in deep learning models. Despite their popularity, their optimality has received limited attention through the lens of statistical decision theory. In this paper, we study decision rules induced by deep, fully connected feedforward ReLU BNNs in the normal location model under quadratic loss. We show that, for fixed prior scales, the induced Bayes decision rule is not minimax. We then propose a hyperprior on the effective output variance of the BNN prior that yields a superharmonic square-root marginal density, establishing that the resulting decision rule is simultaneously admissible and minimax. We further extend these results from the quadratic loss setting to the predictive density estimation problem with Kullback--Leibler loss. Finally, we validate our theoretical findings numerically through simulation.
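The superharmonicity condition mentioned in the abstract comes from classical decision theory (Stein's theory of shrinkage estimation; the same square-root condition appears in the predictive density setting of George, Liang, and Xu). A brief background sketch, stated here as standard context rather than as the paper's own derivation:

```latex
% Normal location model: X \sim N_d(\theta, I), quadratic loss
% L(\theta, a) = \|a - \theta\|^2. For a prior \pi with marginal density
\[
m(x) = \int \varphi_d(x - \theta)\, \pi(\mathrm{d}\theta),
\]
% the Bayes rule has the Tweedie--Brown form
\[
\delta_\pi(x) = x + \nabla \log m(x).
\]
% Stein's sufficient condition for minimaxity: if \sqrt{m} is superharmonic,
\[
\Delta \sqrt{m}(x) \le 0 \quad \text{for all } x \in \mathbb{R}^d,
\]
% then \delta_\pi is minimax, i.e. its risk never exceeds d, the constant
% risk of the MLE \delta_0(x) = x.
```

The paper's contribution, per the abstract, is constructing a hyperprior on the BNN's effective output variance so that the induced marginal satisfies this square-root superharmonicity condition.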

Top-level tag: machine learning theory
Detailed tags: bayesian neural networks, statistical decision theory, minimaxity, admissibility, predictive density estimation

Minimaxity and Admissibility of Bayesian Neural Networks


1️⃣ One-sentence summary

Through the lens of statistical decision theory, this paper proves that, under specific conditions, placing a hyperprior on the effective output variance of a Bayesian neural network yields a decision rule that is simultaneously minimax and admissible, giving a theoretical guarantee of the model's statistical optimality.
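The abstract's setting (normal location model, quadratic loss, minimaxity of shrinkage-type Bayes rules) can be illustrated with a small Monte Carlo experiment. The sketch below compares the MLE to the James–Stein estimator, a classic rule whose marginal has a superharmonic square root; it is illustrative background only and does not reproduce the paper's BNN-induced rule:

```python
# Monte Carlo risk comparison in the normal location model X ~ N(theta, I_d)
# under quadratic loss L(theta, a) = ||a - theta||^2. The James-Stein
# estimator shrinks toward the origin and dominates the MLE for d >= 3,
# showing the MLE delta(x) = x is not admissible in this setting.
import numpy as np

rng = np.random.default_rng(0)
d = 10          # dimension (James-Stein dominates the MLE for d >= 3)
n_mc = 20000    # Monte Carlo replications
theta = np.zeros(d)  # true mean; the risk gap is largest near the origin

x = theta + rng.standard_normal((n_mc, d))  # draws of X ~ N(theta, I_d)

# MLE: delta_0(x) = x, with constant risk d
risk_mle = np.mean(np.sum((x - theta) ** 2, axis=1))

# James-Stein: delta_JS(x) = (1 - (d - 2) / ||x||^2) * x
shrink = 1.0 - (d - 2) / np.sum(x ** 2, axis=1)
x_js = shrink[:, None] * x
risk_js = np.mean(np.sum((x_js - theta) ** 2, axis=1))

print(f"MLE risk         ~ {risk_mle:.3f}  (theory: d = {d})")
print(f"James-Stein risk ~ {risk_js:.3f}  (strictly below d for d >= 3)")
```

At `theta = 0` the James–Stein risk is exactly 2 for any `d >= 3`, while the MLE's risk stays at `d`; this is the kind of quadratic-loss risk comparison the paper's simulations address for BNN-induced rules.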

Source: arXiv:2604.04673