Grow, Assess, Compress: Adaptive Backbone Scaling for Memory-Efficient Class Incremental Learning
1️⃣ One-Sentence Summary
This paper proposes GRACE, an adaptive model-scaling framework that manages model capacity through a cyclic "grow, assess, compress" strategy while continually learning new tasks, effectively preventing forgetting of old knowledge while substantially reducing memory consumption.
Class Incremental Learning (CIL) poses a fundamental challenge: maintaining a balance between the plasticity required to learn new tasks and the stability needed to prevent catastrophic forgetting. While expansion-based methods effectively mitigate forgetting by adding task-specific parameters, they suffer from uncontrolled architectural growth and memory overhead. In this paper, we propose a novel dynamic scaling framework that adaptively manages model capacity through a cyclic "GRow, Assess, ComprEss" (GRACE) strategy. Crucially, we supplement backbone expansion with a saturation assessment phase that evaluates how fully the model's capacity is utilized. This assessment allows the framework to make an informed decision to either expand the architecture or compress the backbones into a streamlined representation, preventing parameter explosion. Experimental results demonstrate that our approach achieves state-of-the-art performance across multiple CIL benchmarks while reducing memory footprint by up to 73% compared to purely expansion-based models.
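The abstract's grow/assess/compress cycle can be illustrated with a minimal sketch. Everything below is a toy assumption for illustration, not the paper's actual implementation: backbones are stand-in weight matrices, "saturation" is a simple fraction-of-large-weights proxy for capacity utilization, and "compression" merely averages backbones in place of real distillation or merging.

```python
# Hypothetical sketch of a GRACE-style controller. The class name, the
# saturation metric, and the compression rule are illustrative assumptions.
import numpy as np

class GraceController:
    def __init__(self, dim=8, saturation_threshold=0.3, max_backbones=3):
        self.dim = dim
        self.saturation_threshold = saturation_threshold
        self.max_backbones = max_backbones
        self.backbones = []  # each backbone: a weight matrix (stand-in)

    def grow(self):
        """GRow: add a fresh task-specific backbone."""
        self.backbones.append(np.random.randn(self.dim, self.dim) * 0.1)

    def assess(self):
        """Assess: fraction of weights with non-negligible magnitude,
        a toy proxy for how much of the capacity is actually used."""
        weights = np.concatenate([b.ravel() for b in self.backbones])
        return float(np.mean(np.abs(weights) > 0.05))

    def compress(self):
        """ComprEss: fold all backbones into one averaged representation
        (a stand-in for distillation or parameter merging)."""
        merged = np.mean(np.stack(self.backbones), axis=0)
        self.backbones = [merged]

    def step(self):
        """One GRACE cycle for a new task: grow, then assess, then
        compress if capacity is under-used or growth hits the cap."""
        self.grow()
        saturation = self.assess()
        if (saturation < self.saturation_threshold
                or len(self.backbones) > self.max_backbones):
            self.compress()
        return saturation, len(self.backbones)
```

The point of the sketch is the control flow: expansion is unconditional per task, but the assessment step gates whether the accumulated backbones are collapsed, which is what bounds memory growth.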
Source: arXiv: 2603.08426