The Rules-and-Facts Model for Simultaneous Generalization and Memorization in Neural Networks
1️⃣ One-Sentence Summary
This paper proposes a theoretical model called "Rules-and-Facts," which explains how modern neural networks, given sufficient parameter capacity and an appropriate optimization strategy, can learn general rules while also memorizing specific facts or exceptions in the training data.
A key capability of modern neural networks is their capacity to simultaneously learn underlying rules and memorize specific facts or exceptions. Yet, theoretical understanding of this dual capability remains limited. We introduce the Rules-and-Facts (RAF) model, a minimal solvable setting that enables precise characterization of this phenomenon by bridging two classical lines of work in the statistical physics of learning: the teacher-student framework for generalization and Gardner-style capacity analysis for memorization. In the RAF model, a fraction $1 - \varepsilon$ of training labels is generated by a structured teacher rule, while a fraction $\varepsilon$ consists of unstructured facts with random labels. We characterize when the learner can simultaneously recover the underlying rule - allowing generalization to new data - and memorize the unstructured examples. Our results quantify how overparameterization enables the simultaneous realization of these two objectives: sufficient excess capacity supports memorization, while regularization and the choice of kernel or nonlinearity control the allocation of capacity between rule learning and memorization. The RAF model provides a theoretical foundation for understanding how modern neural networks can infer structure while storing rare or non-compressible information.
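The RAF data-generating process described above can be sketched in a few lines. This is a minimal illustration, assuming a linear sign teacher as the "structured rule"; the dimensions, sample count, and the variable names (`w_teacher`, `eps`) are illustrative choices, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)

d, n, eps = 50, 1000, 0.1           # input dim, samples, fact fraction (illustrative)
w_teacher = rng.standard_normal(d)  # structured teacher rule (assumed linear here)

X = rng.standard_normal((n, d))
y = np.sign(X @ w_teacher)          # fraction 1 - eps: labels from the teacher rule

n_facts = int(eps * n)              # fraction eps: unstructured "facts"
y[:n_facts] = rng.choice([-1.0, 1.0], size=n_facts)  # overwrite with random labels
```

A learner trained on `(X, y)` must then recover `w_teacher` from the rule-generated majority while using excess capacity to memorize the random-label minority.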
Source: arXiv:2603.25579