Contrastive Continual Learning for Model Adaptability in Internet of Things
1️⃣ One-sentence summary
This paper explores how combining contrastive learning with continual learning can help AI models on IoT devices keep acquiring new knowledge in constantly changing real-world environments without forgetting old skills, while accounting for practical constraints such as limited device compute, unstable networks, and privacy protection.
Internet of Things (IoT) deployments operate in nonstationary, dynamic environments where factors such as sensor drift, evolving user behavior, and heterogeneous privacy requirements can degrade application utility. Continual learning (CL) addresses this by adapting models over time without catastrophic forgetting. Meanwhile, contrastive learning has emerged as a powerful representation-learning paradigm that improves robustness and sample efficiency in a self-supervised manner. This paper surveys *contrastive continual learning* (CCL) for IoT, connecting algorithmic design (replay, regularization, distillation, prompts) with IoT system realities (TinyML constraints, intermittent connectivity, privacy). We present a unifying problem formulation, derive common objectives that blend contrastive and distillation losses, propose an IoT-oriented reference architecture for on-device, edge, and cloud-based CCL, and provide guidance on evaluation protocols and metrics. Finally, we highlight open challenges unique to the IoT domain, including tabular and streaming IoT data, concept drift, federated settings, and energy-aware training.
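The abstract mentions objectives that blend a contrastive loss with a distillation loss. As a rough illustration only (the paper's actual formulation is not given here), a common pattern combines an InfoNCE contrastive term with a KL-based knowledge-distillation term against a frozen "teacher" snapshot of the model from the previous task; the function names, the weighting `lam`, and the temperatures below are all illustrative assumptions, sketched in numpy:

```python
import numpy as np

def info_nce_loss(z, z_pos, temperature=0.1):
    """InfoNCE contrastive loss: each row of z is pulled toward the matching
    row of z_pos and pushed away from the other rows in the batch."""
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    z_pos = z_pos / np.linalg.norm(z_pos, axis=1, keepdims=True)
    logits = z @ z_pos.T / temperature            # (N, N) cosine similarities
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))            # positives on the diagonal

def distillation_loss(student_logits, teacher_logits, tau=2.0):
    """KL divergence from softened teacher predictions to student predictions,
    the usual anti-forgetting distillation term."""
    def softmax(x):
        e = np.exp(x - x.max(axis=1, keepdims=True))
        return e / e.sum(axis=1, keepdims=True)
    p_t = softmax(teacher_logits / tau)
    p_s = softmax(student_logits / tau)
    return np.mean(np.sum(p_t * (np.log(p_t) - np.log(p_s)), axis=1))

def ccl_objective(z, z_pos, student_logits, teacher_logits, lam=0.5):
    """Blended CCL objective: contrastive term plus lam times distillation."""
    return (info_nce_loss(z, z_pos)
            + lam * distillation_loss(student_logits, teacher_logits))
```

In a real CL loop the teacher logits would come from the model checkpoint saved before the current task, so the distillation term penalizes drifting away from previously learned predictions while the contrastive term shapes representations for the new data.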
Source: arXiv:2602.04881