arXiv submission date: 2026-02-18
📄 Abstract - SEMixer: Semantics Enhanced MLP-Mixer for Multiscale Mixing and Long-term Time Series Forecasting

Modeling multiscale patterns is crucial for long-term time series forecasting (TSF). However, redundancy and noise in time series, together with semantic gaps between non-adjacent scales, make the efficient alignment and integration of multiscale temporal dependencies challenging. To address this, we propose SEMixer, a lightweight multiscale model designed for long-term TSF. SEMixer features two key components: a Random Attention Mechanism (RAM) and a Multiscale Progressive Mixing Chain (MPMC). RAM captures diverse time-patch interactions during training and aggregates them via a dropout ensemble at inference, enhancing patch-level semantics and enabling MLP-Mixer to better model multiscale dependencies. MPMC further stacks RAM and MLP-Mixer in a memory-efficient manner, achieving more effective temporal mixing: it bridges semantic gaps across scales and improves multiscale modeling and forecasting performance. We validate the effectiveness of SEMixer not only on 10 public datasets but also on the *2025 CCF AIOps Challenge*, based on 21 GB of real wireless network data, where SEMixer achieves third place. The code is available at this https URL.
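The paper does not give implementation details for RAM in this abstract, but the mechanism it describes (random subsets of time-patch interactions during training, averaged via a dropout ensemble at inference) can be sketched as follows. This is a hypothetical illustration in NumPy, not the authors' code; the function name `random_attention`, the masking scheme, and all parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_attention(patches, drop_p=0.5, n_samples=1, train=True):
    """Hypothetical sketch of a Random Attention Mechanism (RAM):
    during training, a random subset of patch-pair interactions is
    kept (dropout applied to the attention scores); at inference,
    outputs under several random masks are averaged (dropout ensemble)."""
    n, d = patches.shape
    scores = patches @ patches.T / np.sqrt(d)       # patch-pair similarities

    def attend(mask):
        masked = np.where(mask, scores, -np.inf)    # drop some interactions
        w = np.exp(masked - masked.max(axis=1, keepdims=True))
        w /= w.sum(axis=1, keepdims=True)           # row-wise softmax
        return w @ patches                          # mix patch semantics

    def sample_mask():
        mask = rng.random((n, n)) > drop_p
        np.fill_diagonal(mask, True)                # always keep self-interaction
        return mask

    if train:
        return attend(sample_mask())                # one random mask per step
    # inference: average over several random masks (dropout ensemble)
    return np.mean([attend(sample_mask()) for _ in range(n_samples)], axis=0)
```

Each training step sees a different sparse interaction pattern, which is what lets the model "capture diverse time-patch interactions"; averaging several masks at inference plays the role of the ensemble described in the abstract.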

Top-level tags: machine learning, model training, model evaluation
Detailed tags: time series forecasting, multiscale modeling, attention mechanism, mlp-mixer, lightweight architecture

SEMixer: Semantics Enhanced MLP-Mixer for Multiscale Mixing and Long-term Time Series Forecasting


1️⃣ One-sentence summary

This paper proposes SEMixer, a lightweight model that uses a novel Random Attention Mechanism and a Multiscale Progressive Mixing Chain to tackle the difficulty of modeling multiscale patterns in long-term time series forecasting, and validates its strong performance on multiple public datasets and a real-world industrial challenge.

Source: arXiv:2602.16220