arXiv submission date: 2026-04-16
📄 Abstract - A Nonlinear Separation Principle: Applications to Neural Networks, Control and Learning

This paper investigates continuous-time and discrete-time firing-rate and Hopfield recurrent neural networks (RNNs), with applications in nonlinear control design and implicit deep learning. First, we introduce a nonlinear separation principle that guarantees global exponential stability for the interconnection of a contracting state-feedback controller and a contracting observer, alongside parametric extensions for robustness and equilibrium tracking. Second, we derive sharp linear matrix inequality (LMI) conditions that guarantee the contractivity of both firing-rate and Hopfield neural network architectures. We establish structural relationships among these certificates, demonstrating that continuous-time models with monotone non-decreasing activations maximize the admissible weight space, and extend these stability guarantees to interconnected systems and Graph RNNs. Third, we combine our separation principle and LMI framework to solve the output reference tracking problem for RNN-modeled plants. We provide LMI synthesis methods for feedback controllers and observers, and rigorously design a low-gain integral controller to eliminate steady-state error. Finally, we derive an exact, unconstrained algebraic parameterization of our contraction LMIs to design highly expressive implicit neural networks, achieving competitive accuracy and parameter efficiency on standard image classification benchmarks.
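To make the contractivity idea concrete, here is a minimal numerical sketch. It does not use the paper's sharp LMI certificates; instead it applies a cruder classical sufficient condition for the continuous-time Hopfield model x' = -x + W tanh(x) + u with activation slope in [0, 1], namely lambda_max((W + Wᵀ)/2) < 1, and then checks numerically that two trajectories driven by the same input converge. All variable names and the rescaling of W are illustrative assumptions.

```python
import numpy as np

# Hedged sketch (NOT the paper's exact LMI certificate): for the
# continuous-time Hopfield model  x' = -x + W tanh(x) + u  with
# activation slope in [0, 1], a classical sufficient condition for
# contraction is  lambda_max((W + W^T)/2) < 1.

rng = np.random.default_rng(0)
n = 5
W = rng.standard_normal((n, n))

s = np.linalg.eigvalsh((W + W.T) / 2).max()
if s > 0:                     # rescale W so the certificate holds
    W *= 0.8 / s
mu = np.linalg.eigvalsh((W + W.T) / 2).max()
assert mu < 1.0               # certificate: contraction rate >= 1 - mu

# Numerical check: two trajectories under the same constant input converge.
def step(x, u, dt=0.01):
    """One explicit-Euler step of the Hopfield dynamics."""
    return x + dt * (-x + W @ np.tanh(x) + u)

x, y = rng.standard_normal(n), rng.standard_normal(n)
u = rng.standard_normal(n)
d0 = np.linalg.norm(x - y)
for _ in range(2000):
    x, y = step(x, u), step(y, u)
d1 = np.linalg.norm(x - y)
print(f"mu = {mu:.3f}; distance {d0:.3f} -> {d1:.3e}")
```

The observed shrinkage of the inter-trajectory distance is exactly what contractivity promises; the paper's LMI conditions refine this spectral test and extend it to firing-rate, discrete-time, and graph-structured architectures.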

Top-level tags: theory, model, training, systems
Detailed tags: nonlinear control, recurrent neural networks, stability analysis, linear matrix inequalities, implicit neural networks

A Nonlinear Separation Principle: Applications to Neural Networks, Control and Learning


1️⃣ One-sentence summary

This paper proposes a nonlinear separation principle that guarantees closed-loop stability, and builds on it a general analysis and synthesis framework for neural networks, control systems, and deep learning, with applications ranging from stabilizing control to the construction of efficient neural network models.

Source: arXiv: 2604.15238