级联迁移:在预算约束下学习多个任务 / Cascaded Transfer: Learning Many Tasks under Budget Constraints
1️⃣ One-Sentence Summary
This paper proposes a new method called "Cascaded Transfer Learning". Like an assembly line, it passes information (e.g., model parameters) from task to task and refines it along a tree structure built from task similarity, so that a large collection of related tasks can be learned more efficiently and accurately under a limited training budget.
Many-task learning refers to the setting where a large number of related tasks must be learned while the exact relationships between the tasks are not known. We introduce Cascaded Transfer Learning, a novel many-task transfer-learning paradigm in which information (e.g., model parameters) cascades hierarchically through tasks that are learned by individual models of the same class, while respecting given budget constraints. The cascade is organized as a rooted tree that specifies the order in which tasks are learned and refined. We design a cascaded transfer mechanism deployed over a minimum spanning tree that connects the tasks according to a suitable distance measure and allocates the available training budget along its branches. Experiments on synthetic and real many-task settings show that the resulting method enables more accurate and cost-effective adaptation across large task collections than alternative approaches.
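To make the mechanism concrete, here is a minimal sketch of the cascade idea: tasks are connected by a minimum spanning tree over a pairwise distance measure, and "parameters" flow from a root task down the branches. The 2-D task embeddings, the Euclidean distance, and the simple averaging "adaptation" step are all illustrative assumptions, not the paper's actual algorithm.

```python
# Illustrative sketch: MST over task distances + root-first cascade.
import heapq
import math
from collections import defaultdict

def mst_edges(points):
    """Prim's algorithm over Euclidean distances; returns tree edges."""
    n = len(points)
    dist = lambda a, b: math.dist(points[a], points[b])
    visited = {0}
    heap = [(dist(0, j), 0, j) for j in range(1, n)]
    heapq.heapify(heap)
    edges = []
    while len(visited) < n:
        d, u, v = heapq.heappop(heap)
        if v in visited:
            continue
        visited.add(v)
        edges.append((u, v))
        for w in range(n):
            if w not in visited:
                heapq.heappush(heap, (dist(v, w), v, w))
    return edges

def cascade(points, root=0):
    """Traverse the MST from the root; each task starts from its
    parent's (toy) parameters, mimicking hierarchical transfer."""
    tree = defaultdict(list)
    for u, v in mst_edges(points):
        tree[u].append(v)
        tree[v].append(u)
    params = {root: list(points[root])}  # toy "trained" root parameters
    order, stack, seen = [], [root], {root}
    while stack:
        u = stack.pop()
        order.append(u)
        for v in tree[u]:
            if v not in seen:
                seen.add(v)
                # Child initializes from its parent, then "adapts" toward
                # its own task (averaging is a stand-in for real training).
                params[v] = [(p + q) / 2 for p, q in zip(params[u], points[v])]
                stack.append(v)
    return order, params

# Four toy tasks embedded in 2-D; nearby tasks transfer to each other first.
tasks = [(0.0, 0.0), (0.1, 0.0), (1.0, 1.0), (1.1, 1.0)]
order, params = cascade(tasks)
print(order)  # → [0, 1, 2, 3]: a root-first traversal of the MST
```

In this toy layout the MST links each task to its nearest neighbor, so the cascade visits the two close task clusters in sequence; a per-branch budget allocation, as described in the abstract, would additionally decide how much training each edge of this tree receives.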
Source: arXiv:2601.21513