Deep TPC: Temporal-Prior Conditioning for Time Series Forecasting
1️⃣ One-Sentence Summary
This paper proposes a new method called 'Temporal-Prior Conditioning', which keeps temporal information active across multiple layers of a large language model, significantly improving time series forecasting accuracy and surpassing existing mainstream techniques.
LLM-for-time-series (TS) methods typically treat time shallowly, injecting positional or prompt-based cues once at the input of a largely frozen decoder, which limits temporal reasoning because this information degrades through the layers. We introduce Temporal-Prior Conditioning (TPC), which elevates time to a first-class modality that conditions the model at multiple depths. TPC attaches a small set of learnable time series tokens to the patch stream; at selected layers these tokens cross-attend to temporal embeddings derived from compact, human-readable temporal descriptors encoded by the same frozen LLM, then feed temporal context back via self-attention. This disentangles the time series signal from the temporal information while keeping the parameter budget low. We show that by training only the cross-attention modules, TPC consistently outperforms both full fine-tuning and shallow conditioning strategies, achieving state-of-the-art long-term forecasting performance across diverse datasets. Code available at: this https URL
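The conditioning pattern the abstract describes lends itself to a compact sketch: learnable time tokens ride along the patch stream, cross-attend to descriptor embeddings at selected depths, and then mix back into the patches through the frozen block's self-attention. Below is a minimal PyTorch sketch under stated assumptions: `TPCLayer`, the token counts, and the generic `nn.TransformerEncoderLayer` standing in for a real frozen LLM decoder block are all illustrative, not the authors' implementation.

```python
import torch
import torch.nn as nn

class TPCLayer(nn.Module):
    """One conditioned layer: a frozen block runs self-attention over
    [time tokens; patches], after a small trainable cross-attention lets
    the time tokens read the temporal-descriptor embeddings."""

    def __init__(self, d_model: int, n_heads: int = 8):
        super().__init__()
        # Stand-in for one frozen LLM decoder block (assumption: the real
        # method wraps actual layers of a pretrained LLM).
        self.frozen_block = nn.TransformerEncoderLayer(
            d_model, n_heads, batch_first=True)
        for p in self.frozen_block.parameters():
            p.requires_grad = False
        # The only trainable pieces: the cross-attention and its norm.
        self.cross_attn = nn.MultiheadAttention(
            d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, time_tokens, patches, temporal_emb):
        # 1) Time tokens cross-attend to embeddings of human-readable
        #    temporal descriptors encoded by the same frozen LLM.
        ctx, _ = self.cross_attn(time_tokens, temporal_emb, temporal_emb)
        time_tokens = self.norm(time_tokens + ctx)
        # 2) Frozen self-attention over the joint stream feeds the
        #    temporal context back into the patch tokens.
        k = time_tokens.size(1)
        out = self.frozen_block(torch.cat([time_tokens, patches], dim=1))
        return out[:, :k], out[:, k:]

# Toy usage: 4 learnable time tokens riding along a 32-patch stream.
B, d = 2, 64
time_tokens = nn.Parameter(torch.randn(1, 4, d)).expand(B, -1, -1)
patches = torch.randn(B, 32, d)       # patch embeddings of the series
temporal_emb = torch.randn(B, 10, d)  # descriptor embeddings from the LLM
layer = TPCLayer(d)
time_tokens, patches = layer(time_tokens, patches, temporal_emb)
print(patches.shape)  # torch.Size([2, 32, 64])
```

In a full model this layer would be stacked at the "selected depths" the abstract mentions, with the same learnable time tokens threaded through; everything except `cross_attn` and `norm` stays frozen, which is what keeps the trainable parameter budget low.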
Source: arXiv:2602.16188