arXiv submission date: 2026-04-21
📄 Abstract - Phase Transitions in the Fluctuations of Functionals of Random Neural Networks

We establish central and non-central limit theorems for sequences of functionals of the Gaussian output of an infinitely wide random neural network on the d-dimensional sphere. We show that the asymptotic behaviour of these functionals as the depth of the network increases depends crucially on the fixed points of the covariance function, resulting in three distinct limiting regimes: convergence to the same functional of a limiting Gaussian field, convergence to a Gaussian distribution, or convergence to a distribution in the Qth Wiener chaos. Our proofs exploit tools that are now classical (Hermite expansions, the Diagram Formula, Stein-Malliavin techniques), but also ideas that have not previously been used in similar contexts: in particular, the asymptotic behaviour is determined by the fixed-point structure of the iterative operator associated with the covariance, whose nature and stability govern the different limiting regimes.

Top-level tags: theory, machine learning
Detailed tags: neural networks, random neural networks, phase transitions, central limit theorem, Wiener chaos

随机神经网络泛函波动的相变现象 / Phase Transitions in the Fluctuations of Functionals of Random Neural Networks


1️⃣ One-sentence summary

The paper studies the statistical behaviour of functionals of the output of a deep random neural network as the number of layers increases, and finds that the pattern of fluctuations depends on the fixed points of the covariance function. This produces three sharply distinct limiting regimes: convergence to the same functional of a fixed limiting Gaussian field, convergence to an ordinary normal distribution, or convergence to a distribution in a higher-order Wiener chaos, much like a phase transition in physics.
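The fixed-point mechanism behind these regimes can be illustrated numerically. The sketch below is not the paper's construction: it uses the well-known arccos correlation map for infinitely wide ReLU networks (Cho & Saul's kernel), which plays the role of the iterative covariance operator. Iterating it over depth shows correlations being driven toward a fixed point, whose stability is what the abstract says governs the limiting regime.

```python
import math

def relu_correlation_map(rho: float) -> float:
    """One depth step of the correlation recursion for an
    infinitely wide ReLU network (normalized arccos kernel).
    This is an illustrative stand-in for the paper's covariance
    operator, not its actual definition."""
    rho = max(-1.0, min(1.0, rho))  # guard against rounding outside [-1, 1]
    return (math.sqrt(1.0 - rho * rho)
            + rho * (math.pi - math.acos(rho))) / math.pi

# Iterate the map over network depth, starting from correlation 0.3.
rho = 0.3
history = [rho]
for _ in range(50):
    rho = relu_correlation_map(rho)
    history.append(rho)

# For ReLU the iterates increase monotonically toward the fixed
# point rho = 1, but the approach is slow because the fixed point
# is only marginally stable (the map's derivative there equals 1).
print(history[0], history[5], history[-1])
```

Replacing the ReLU map with a covariance map whose fixed point is strictly stable (derivative of modulus less than 1) gives geometric convergence instead, which is the kind of qualitative change in fixed-point structure that separates the limiting regimes described above.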

Source: arXiv:2604.19738