Adapting Time Series Foundation Models through Data Mixtures
1️⃣ One-sentence summary
This paper proposes MixFT, a new method that fine-tunes time series foundation models by intelligently re-partitioning and mixing the data. Compared with conventional approaches, this lets the model adapt better to the different data patterns of a new domain, ultimately improving its zero-shot forecasting performance.
Time series foundation models (TSFMs) have become increasingly popular for zero-shot forecasting. However, for a new time series domain not fully covered by the pretraining set, performance can suffer. Therefore, when a practitioner cares about a new domain and has access to a set of related datasets, the question arises: how best to fine-tune a TSFM to improve zero-shot forecasting? A typical approach to this type of problem is to fine-tune a LoRA module on all datasets jointly, or separately on each dataset. Tuning a separate module on each dataset allows the TSFM to specialise to different types of data distribution, by selecting differing combinations of per-dataset modules for different time series contexts. However, we find that using per-dataset modules might not be optimal, since a time series dataset can contain data from several types of distributions, i.e., sub-domains. This can be due to distribution shift over time, or to differing distributions across different dimensions of the time series. Hence, we propose MixFT, which re-divides the data using Bayesian mixtures into sets that best represent the sub-domains present in the data, and fine-tunes separately on each of these sets. This re-division ensures that each set is more homogeneous, leading to fine-tuned modules focused on specific sub-domains. Our experiments show that MixFT outperforms both per-dataset fine-tuning and fine-tuning a single module on all the data. This suggests that by re-partitioning the data to represent sub-domains, we can better specialise TSFMs to improve zero-shot forecasting.
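The re-partitioning step can be sketched in a few lines. Below is a minimal illustration, assuming sub-domains are discovered by fitting a Bayesian Gaussian mixture over simple per-window summary statistics; the feature choice, component count, and clustering library are illustrative assumptions, not the paper's exact recipe.

```python
# Hypothetical sketch of MixFT-style data re-partitioning: cluster time
# series windows with a Bayesian mixture, then treat each cluster as a
# sub-domain that would receive its own fine-tuned LoRA module.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)

# Toy "dataset" mixing two latent sub-domains: calm and volatile windows.
windows = np.concatenate([
    rng.normal(0.0, 0.1, size=(50, 32)),   # low-variance sub-domain
    rng.normal(0.0, 2.0, size=(50, 32)),   # high-variance sub-domain
])

# Summarise each window by its mean and std, a crude distributional signature.
features = np.column_stack([windows.mean(axis=1), windows.std(axis=1)])

# A Dirichlet-process prior lets the mixture prune unused components,
# so the number of sub-domains need not be fixed in advance.
bgm = BayesianGaussianMixture(
    n_components=8,
    weight_concentration_prior_type="dirichlet_process",
    random_state=0,
).fit(features)
labels = bgm.predict(features)

# Each subset would then be used to fine-tune a separate module.
subsets = {int(k): windows[labels == k] for k in np.unique(labels)}
print(sorted(len(v) for v in subsets.values()))
```

The key design point is that clustering operates on distributional features rather than on dataset boundaries, so a single dataset whose statistics drift over time can be split across several sub-domain subsets.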
Source: arXiv: 2603.02840