arXiv submission date: 2026-02-12
📄 Abstract - TS-Memory: Plug-and-Play Memory for Time Series Foundation Models

Time Series Foundation Models (TSFMs) achieve strong zero-shot forecasting through large-scale pre-training, but adapting them to downstream domains under distribution shift remains challenging. Existing solutions face a trade-off: Parametric Adaptation can cause catastrophic forgetting and requires costly multi-domain maintenance, while Non-Parametric Retrieval improves forecasts but incurs high inference latency due to datastore search. We propose Parametric Memory Distillation and implement it as TS-Memory, a lightweight memory adapter that augments frozen TSFMs. TS-Memory is trained in two stages. First, we construct an offline, leakage-safe kNN teacher that synthesizes confidence-aware quantile targets from retrieved futures. Second, we distill this retrieval-induced distributional correction into a lightweight memory adapter via confidence-gated supervision. During inference, TS-Memory fuses memory and backbone predictions with constant-time overhead, enabling retrieval-free deployment. Experiments across diverse TSFMs and benchmarks demonstrate consistent improvements in both point and probabilistic forecasting over representative adaptation methods, with efficiency comparable to the frozen backbone.
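The abstract states that at inference time the memory adapter's correction is fused with the frozen backbone's forecast at constant cost, with no datastore lookup. The sketch below is a minimal illustration of what such retrieval-free, gated fusion could look like; the module names (`MemoryAdapter`, `fused_forecast`), tensor shapes, and the gated-additive fusion form are assumptions for illustration, not the paper's actual architecture.

```python
# Minimal sketch (not the authors' code): fusing a lightweight memory adapter's
# correction with a frozen TSFM backbone at inference time. All names, shapes,
# and the gating form are assumptions; the abstract only says memory and backbone
# predictions are fused with constant-time overhead and no retrieval.
import torch
import torch.nn as nn


class MemoryAdapter(nn.Module):
    """Hypothetical adapter: maps the context window to a per-quantile correction
    plus a confidence gate, standing in for the distilled kNN-teacher behaviour."""

    def __init__(self, context_len: int, horizon: int, n_quantiles: int, hidden: int = 128):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(context_len, hidden), nn.ReLU())
        # Quantile-wise additive correction to the backbone forecast.
        self.correction_head = nn.Linear(hidden, horizon * n_quantiles)
        # Scalar gate in (0, 1): how much to trust the memory correction.
        self.gate_head = nn.Linear(hidden, 1)
        self.horizon = horizon
        self.n_quantiles = n_quantiles

    def forward(self, context: torch.Tensor):
        # context: (batch, context_len)
        h = self.encoder(context)
        correction = self.correction_head(h).view(-1, self.horizon, self.n_quantiles)
        gate = torch.sigmoid(self.gate_head(h)).unsqueeze(-1)  # (batch, 1, 1)
        return correction, gate


@torch.no_grad()
def fused_forecast(backbone, adapter: MemoryAdapter, context: torch.Tensor) -> torch.Tensor:
    """Retrieval-free inference: one frozen-backbone pass plus one adapter pass,
    so the overhead over the backbone alone is constant in the datastore size."""
    base = backbone(context)             # (batch, horizon, n_quantiles), backbone stays frozen
    correction, gate = adapter(context)  # distilled retrieval-style correction + confidence
    return base + gate * correction      # gated additive fusion (assumed form)
```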

Top-level tags: model training, model evaluation, machine learning
Detailed tags: time series forecasting, foundation models, parametric adaptation, knowledge distillation, distribution shift

TS-Memory: Plug-and-Play Memory for Time Series Foundation Models


1️⃣ One-Sentence Summary

This paper proposes TS-Memory, a lightweight memory adapter that uses a two-stage training procedure to distill the predictive advantages of retrieval-based methods into a parametric module, allowing time series foundation models to adapt to new domains' data distributions without any retrieval overhead while keeping inference efficient.
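The first training stage described in the abstract builds an offline, leakage-safe kNN teacher that turns retrieved futures into confidence-aware quantile targets. The sketch below shows one plausible way to construct such targets; the distance metric, confidence mapping, quantile levels, and function name are illustrative assumptions, not the paper's exact procedure.

```python
# Minimal sketch (assumption, not the authors' pipeline): an offline kNN "teacher"
# that retrieves nearest historical contexts and turns their observed futures into
# confidence-weighted quantile targets for distillation.
import numpy as np


def knn_teacher_targets(
    query_context: np.ndarray,   # (context_len,)
    bank_contexts: np.ndarray,   # (n, context_len) historical contexts
    bank_futures: np.ndarray,    # (n, horizon) futures aligned with bank_contexts
    k: int = 32,
    quantiles=(0.1, 0.5, 0.9),
):
    """Return (targets, confidence): empirical quantiles of the k retrieved futures
    and a scalar confidence derived from retrieval distances.

    Leakage safety (only futures fully observed before the query time) must be
    enforced when building bank_contexts / bank_futures, as the paper requires.
    """
    # Euclidean distance in raw context space; the paper may use a learned embedding.
    dists = np.linalg.norm(bank_contexts - query_context, axis=1)
    idx = np.argsort(dists)[:k]
    retrieved = bank_futures[idx]                    # (k, horizon)

    # Empirical quantiles across retrieved futures -> (horizon, n_quantiles) target.
    targets = np.quantile(retrieved, quantiles, axis=0).T

    # Tighter neighbourhoods -> higher confidence; usable to gate the distillation loss.
    confidence = float(np.exp(-dists[idx].mean()))
    return targets, confidence
```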

Source: arXiv:2602.11550