arXiv submission date: 2026-02-17
📄 Abstract - Continuous-Time Piecewise-Linear Recurrent Neural Networks

In dynamical systems reconstruction (DSR) we aim to recover the dynamical system (DS) underlying observed time series. Specifically, we aim to learn a generative surrogate model which approximates the underlying, data-generating DS and recreates its long-term properties ('climate statistics'). In scientific and medical areas in particular, these models need to be mechanistically tractable: through their mathematical analysis we would like to obtain insight into the recovered system's workings. Piecewise-linear (PL), ReLU-based RNNs (PLRNNs) have a strong track record in this regard, representing SOTA DSR models while allowing mathematical insight by virtue of their PL design. However, all current PLRNN variants are discrete-time maps. This is at odds with the assumed continuous-time nature of most physical and biological processes, and makes it hard to accommodate data arriving at irregular temporal intervals. Neural ODEs are one solution, but they do not reach the DSR performance of PLRNNs and often lack their tractability. Here we develop theory for continuous-time PLRNNs (cPLRNNs): We present a novel algorithm for training and simulating such models, bypassing numerical integration by efficiently exploiting their PL structure. We further demonstrate how important topological objects like equilibria or limit cycles can be determined semi-analytically in trained models. We compare cPLRNNs to both their discrete-time cousins and Neural ODEs on DSR benchmarks, including systems with discontinuities that come with hard thresholds.
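The key structural idea the abstract alludes to, bypassing numerical integration by exploiting the PL structure, can be illustrated with a minimal sketch. Inside each linear region (fixed ReLU sign pattern), a piecewise-linear ODE reduces to an affine linear ODE with a closed-form matrix-exponential solution. The function below is a hypothetical illustration of this principle, not the paper's actual training or simulation algorithm; region switching is handled crudely by re-deriving the active sign pattern at fixed intermediate points.

```python
# Hedged sketch (not the paper's algorithm): for a piecewise-linear ODE
#   dz/dt = A z + W relu(z) + h,
# fixing the ReLU sign pattern D = diag(z > 0) yields the affine ODE
#   dz/dt = M z + h,  M = A + W D,
# which is solved in closed form via the matrix exponential.
import numpy as np
from scipy.linalg import expm

def cplrnn_flow(z0, A, W, h, t, n_checks=100):
    """Advance z0 by time t using exact affine-ODE steps, re-deriving
    the active linear region at n_checks intermediate points."""
    z = np.asarray(z0, dtype=float)
    n, dt = len(z), t / n_checks
    for _ in range(n_checks):
        D = np.diag((z > 0).astype(float))   # current ReLU sign pattern
        M = A + W @ D                        # this region's linear dynamics
        # Exact affine step: augment the state with a constant 1 so the
        # bias h is absorbed into an enlarged linear system.
        Maug = np.zeros((n + 1, n + 1))
        Maug[:n, :n] = M
        Maug[:n, n] = h
        z = (expm(Maug * dt) @ np.append(z, 1.0))[:n]
    return z
```

With `W = 0` and `h = 0` the dynamics are globally linear, so the result must coincide with `expm(A * t) @ z0`, which gives a quick sanity check of the stepping scheme.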

Top-level tags: machine learning systems theory
Detailed tags: dynamical systems reconstruction, piecewise-linear rnns, continuous-time models, neural odes, model tractability

Continuous-Time Piecewise-Linear Recurrent Neural Networks


1️⃣ One-sentence summary

This paper proposes a new continuous-time piecewise-linear recurrent neural network model and a training algorithm for it, which reconstructs complex dynamical systems as effectively as traditional discrete-time models, retains their mathematical tractability, and can directly handle data sampled at irregular time intervals.

Source: arXiv:2602.15649