arXiv submission date: 2026-03-16
📄 Abstract - Cross-RAG: Zero-Shot Retrieval-Augmented Time Series Forecasting via Cross-Attention

Recent advances in time series foundation models (TSFMs) demonstrate strong expressive capacity through large-scale pretraining across diverse time series domains. Zero-shot time series forecasting with TSFMs, however, exhibits limited generalization to unseen datasets, which retrieval-augmented forecasting addresses by leveraging an external knowledge base. Existing approaches rely on a fixed number of retrieved samples that may introduce irrelevant information. To this end, we propose Cross-RAG, a zero-shot retrieval-augmented forecasting framework that selectively attends to query-relevant retrieved samples. Cross-RAG models input-level relevance between the query and retrieved samples via query-retrieval cross-attention, while jointly incorporating information from the query and retrieved samples. Extensive experiments demonstrate that Cross-RAG consistently improves zero-shot forecasting performance across various TSFMs and RAG methods, and additional analyses confirm its effectiveness across diverse retrieval scenarios. Code is available at this https URL.

Top-level tags: machine learning, model evaluation, systems
Detailed tags: time series forecasting, retrieval-augmented generation, zero-shot learning, cross-attention, foundation models

Cross-RAG: Zero-Shot Retrieval-Augmented Time Series Forecasting via Cross-Attention


1️⃣ One-Sentence Summary

This paper proposes a new method called Cross-RAG, which uses a cross-attention mechanism to let a time series forecasting model automatically select and exploit the most relevant samples from an external knowledge base, thereby improving the accuracy and generalization of zero-shot forecasting on unseen datasets.
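The core idea, as the abstract describes it, is to score input-level relevance between the query series and each retrieved sample so that irrelevant retrievals are downweighted rather than blindly averaged in. A minimal sketch of such query-retrieval cross-attention is below; the function name, the use of plain embedding vectors, and the single-head dot-product formulation are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def query_retrieval_cross_attention(query, retrieved):
    """Hypothetical sketch: a query embedding of shape (d,) attends over
    K retrieved-sample embeddings of shape (K, d). The attention weights
    act as per-sample relevance scores, so irrelevant retrievals
    contribute little to the aggregated context."""
    d = query.shape[-1]
    scores = retrieved @ query / np.sqrt(d)   # (K,) relevance logits
    weights = softmax(scores)                 # (K,) sums to 1
    context = weights @ retrieved             # (d,) relevance-weighted mix
    return context, weights

# Toy usage: 5 retrieved samples embedded in 16 dimensions.
rng = np.random.default_rng(0)
q = rng.standard_normal(16)
R = rng.standard_normal((5, 16))
ctx, w = query_retrieval_cross_attention(q, R)
print(ctx.shape, w.shape)
```

In a full model the context vector would then be fused with the query representation before the forecasting head; here it simply illustrates how soft attention replaces a fixed top-k cutoff.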

Source: arXiv 2603.14709