arXiv submission date: 2026-03-23
📄 Abstract - The Golden Subspace: Where Efficiency Meets Generalization in Continual Test-Time Adaptation

Continual Test-Time Adaptation (CTTA) aims to enable models to adapt online to unlabeled data streams under distribution shift without accessing source data. Existing CTTA methods face an efficiency-generalization trade-off: updating more parameters improves adaptation but severely reduces online inference efficiency. An ideal solution is to achieve comparable adaptation with minimal feature updates; we call this minimal subspace the golden subspace. We prove its existence in a single-step adaptation setting and show that it coincides with the row space of the pretrained classifier. To enable online maintenance of this subspace, we introduce the sample-wise Average Gradient Outer Product (AGOP) as an efficient proxy for estimating the classifier weights without retraining. Building on these insights, we propose Guided Online Low-rank Directional adaptation (GOLD), which uses a lightweight adapter to project features onto the golden subspace and learns a compact scaling vector while the subspace is dynamically updated via AGOP. Extensive experiments on classification and segmentation benchmarks, including autonomous-driving scenarios, demonstrate that GOLD attains superior efficiency, stability, and overall performance. Our code is available at this https URL.
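The abstract's core mechanism can be illustrated with a toy sketch. For a linear head f(x) = Wx, each per-sample Jacobian with respect to the feature equals W, so the Average Gradient Outer Product (AGOP) reduces to W^T W, and its top eigenvectors recover the row space of the classifier, i.e. the "golden subspace". The sketch below is a minimal illustration under that assumption, not the paper's actual GOLD implementation; all dimensions, the toy sample-wise Jacobian, and the `adapt` adapter are hypothetical names chosen for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (illustrative only, not the paper's settings)
d, c, n = 16, 4, 256          # feature dim, num classes, num samples
W = rng.normal(size=(c, d))   # stand-in for the pretrained classifier weights

# Sample-wise AGOP: average J_i^T J_i over samples. For a purely linear
# head every J_i = W; here a mild sample-dependent row scaling mimics a
# nonlinear model while keeping the Jacobians inside row(W).
X = rng.normal(size=(n, d))
agop = np.zeros((d, d))
for x in X:
    J = W * (1.0 + 0.1 * np.tanh(W @ x))[:, None]  # toy per-sample Jacobian
    agop += J.T @ J
agop /= n

# Top-c eigenvectors of the AGOP estimate the classifier's row space
eigvals, eigvecs = np.linalg.eigh(agop)  # eigenvalues in ascending order
U = eigvecs[:, -c:]                      # (d, c) estimated "golden subspace"

# Lightweight adapter: project a feature onto the subspace and rescale
# its in-subspace component with a compact vector s (left at its init here).
s = np.ones(c)

def adapt(feat):
    coords = U.T @ feat                        # coordinates in the subspace
    return feat + U @ (s * coords - coords)    # rescale only the in-subspace part

# Sanity check: the estimated subspace should capture row(W)
P = U @ U.T
residual = np.linalg.norm(W - W @ P) / np.linalg.norm(W)
print(f"row-space residual: {residual:.3f}")   # near zero when subspace matches row(W)
```

In an online CTTA loop, only `s` (and the AGOP running estimate) would be updated per batch, which is what keeps the adapted parameter count, and hence inference overhead, small.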

Top-level tags: model training, model evaluation, machine learning
Detailed tags: continual learning, test-time adaptation, online adaptation, distribution shift, efficiency

The Golden Subspace: Where Efficiency Meets Generalization in Continual Test-Time Adaptation


1️⃣ One-Sentence Summary

This paper proposes GOLD, a method that identifies and dynamically maintains a critical "golden subspace" so that a model adapting continually to new data keeps fast, efficient inference while achieving strong adaptation performance, resolving the usual trade-off between efficiency and generalization.

Source: arXiv:2603.21928