arXiv submission date: 2026-03-18
📄 Abstract - Baguan-TS: A Sequence-Native In-Context Learning Model for Time Series Forecasting with Covariates

Transformers enable in-context learning (ICL) for rapid, gradient-free adaptation in time series forecasting, yet most ICL-style approaches rely on tabularized, hand-crafted features, while end-to-end sequence models lack inference-time adaptation. We bridge this gap with a unified framework, Baguan-TS, which integrates raw-sequence representation learning with ICL, instantiated as a 3D Transformer that attends jointly over the temporal, variable, and context axes. To make this high-capacity model practical, we tackle two key hurdles: (i) calibration and training stability, improved with a feature-agnostic, retrieval-based local calibration in target space; and (ii) output oversmoothing, mitigated via a context-overfitting strategy. On public benchmarks with covariates, Baguan-TS consistently outperforms established baselines, achieving the highest win rate and significant reductions in both point and probabilistic forecasting metrics. Further evaluations across diverse real-world energy datasets demonstrate its robustness, yielding substantial improvements.
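The abstract does not spell out how the 3D Transformer's attention is computed; one common way to realize attention over three axes is to apply self-attention along each axis in turn (axial/factorized attention). The sketch below illustrates that idea in NumPy on a toy `(context, variable, time, dim)` tensor. The factorization itself, the function names, and the identity Q/K/V projections are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attend(x, axis):
    """Self-attention along one axis of a (context, variable, time, dim)
    tensor. Q/K/V are identity projections to keep the sketch minimal."""
    x = np.moveaxis(x, axis, -2)                       # target axis next to feature dim
    scores = x @ np.swapaxes(x, -1, -2) / np.sqrt(x.shape[-1])
    out = softmax(scores, axis=-1) @ x                 # weighted mix along the target axis
    return np.moveaxis(out, -2, axis)

# toy tensor: 4 in-context examples, 3 variables (target + covariates),
# 16 time steps, 8-dim embeddings
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 3, 16, 8))

# factorized "3D attention": one pass per axis (context, variable, time)
for ax in (0, 1, 2):
    x = attend(x, ax)

print(x.shape)  # shape is preserved: (4, 3, 16, 8)
```

Each pass mixes information along exactly one axis, so after all three passes every position has attended (indirectly) to every other position, at far lower cost than flattening all three axes into a single attention sequence.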

Top-level tags: machine learning, model training, model evaluation
Detailed tags: time series forecasting, in-context learning, transformers, covariates, 3D attention

Baguan-TS: A Sequence-Native In-Context Learning Model for Time Series Forecasting with Covariates


1️⃣ One-sentence summary

This paper proposes a new model called Baguan-TS, which combines raw-sequence representation learning with in-context learning: a 3D Transformer processes temporal, variable, and context information jointly, addressing the training-stability and forecast-accuracy shortcomings of existing methods and achieving stronger performance across a range of time series forecasting tasks.

Source: arXiv 2603.17439