arXiv submission date: 2026-05-11
📄 Abstract - Variational Inference for Lévy Process-Driven SDEs via Neural Tilting

Modelling extreme events and heavy-tailed phenomena is central to building reliable predictive systems in domains such as finance, climate science, and safety-critical AI. While Lévy processes provide a natural mathematical framework for capturing jumps and heavy tails, Bayesian inference for Lévy-driven stochastic differential equations (SDEs) remains intractable with existing methods: Monte Carlo approaches are rigorous but lack scalability, whereas neural variational inference methods are efficient but rely on Gaussian assumptions that fail to capture discontinuities. We address this tension by introducing a neural exponential tilting framework for variational inference in Lévy-driven SDEs. Our approach constructs a flexible variational family by exponentially reweighting the Lévy measure using neural networks. This parametrization preserves the jump structure of the underlying process while remaining computationally tractable. To enable efficient inference, we develop a quadratic neural parametrization that yields closed-form normalization of the tilted measure, a conditional Gaussian representation for stable processes that facilitates simulation, and symmetry-aware Monte Carlo estimators for scalable optimization. Empirically, we demonstrate that the method accurately captures jump dynamics and yields reliable posterior inference in regimes where Gaussian-based variational approaches fail, on both synthetic and real-world datasets.
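The abstract's "quadratic neural parametrization that yields closed-form normalization" can be illustrated in a simplified setting. As a hedged sketch (not the paper's exact construction): exponentially tilting a Gaussian base density by a quadratic log-tilt h(x) = a·x − b·x² yields another Gaussian, so the normalizing constant and moments are available in closed form. The function name `tilted_gaussian_params` and the specific parameters are illustrative assumptions.

```python
import numpy as np

def tilted_gaussian_params(mu, sigma2, a, b):
    """Exponentially tilt N(mu, sigma2) by exp(a*x - b*x**2), with b > -1/(2*sigma2).
    Completing the square shows the tilted density is again Gaussian:
    precision 1/sigma2 + 2b, natural parameter mu/sigma2 + a.
    Returns the tilted (mean, variance) in closed form."""
    prec = 1.0 / sigma2 + 2.0 * b
    var_t = 1.0 / prec
    mean_t = var_t * (mu / sigma2 + a)
    return mean_t, var_t

# Sanity check against an importance-weighted Monte Carlo estimate.
rng = np.random.default_rng(0)
mu, sigma2, a, b = 0.0, 1.0, 0.5, 0.2
x = rng.normal(mu, np.sqrt(sigma2), 200_000)
w = np.exp(a * x - b * x**2)          # unnormalized tilting weights
mc_mean = np.sum(w * x) / np.sum(w)   # self-normalized estimate of tilted mean
mean_t, var_t = tilted_gaussian_params(mu, sigma2, a, b)
```

Here `mc_mean` should agree with the closed-form `mean_t` up to Monte Carlo error, which is the tractability the quadratic parametrization buys: no numerical normalization is needed inside the variational objective.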

Top-level tags: machine learning, finance, theory
Detailed tags: variational inference, Lévy processes, stochastic differential equations, heavy tails, neural exponential tilting

Variational Inference for Lévy Process-Driven SDEs via Neural Tilting


1️⃣ One-sentence summary

This paper proposes a variational inference method that combines neural networks with exponential tilting to handle stochastic differential equations driven by Lévy processes. The method preserves the jump and heavy-tail characteristics of the data while remaining computationally efficient, addressing the poor scalability of traditional Monte Carlo methods and the inability of neural variational methods to handle discontinuities.
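The abstract also mentions "symmetry-aware Monte Carlo estimators for scalable optimization." A standard way to exploit symmetry of a base measure is antithetic variates: pair each sample x with −x and average. The sketch below is a generic illustration under the assumption of a symmetric sampling law; the paper's exact estimator may differ, and `antithetic_mean` is a hypothetical name.

```python
import numpy as np

def antithetic_mean(f, sampler, n, rng):
    """Symmetry-aware MC: for a base law symmetric about 0, averaging
    f(x) and f(-x) keeps the estimator unbiased while cancelling the
    odd part of f, often reducing variance substantially."""
    x = sampler(n, rng)
    return 0.5 * np.mean(f(x) + f(-x))

# Example: E[x + x^2] under N(0, 1) is exactly 1; the antithetic pair
# cancels the odd term x, leaving only the mean of x^2.
rng = np.random.default_rng(1)
est = antithetic_mean(lambda x: x + x**2,
                      lambda n, r: r.normal(size=n),
                      100_000, rng)
```

For the odd component of the integrand the cancellation is exact, which is why such estimators pair naturally with symmetric (e.g. symmetric stable) driving noise.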

Source: arXiv:2605.10934