Mosaic Learning: A Framework for Decentralized Learning with Model Fragmentation
1️⃣ One-Sentence Summary
This paper proposes a new method called "Mosaic Learning", which splits a machine learning model into multiple fragments and propagates them independently across the network, thereby reducing communication overhead while significantly improving the final performance of decentralized collaborative learning, all without a central server.
Decentralized learning (DL) enables collaborative machine learning (ML) without a central server, making it suitable for settings where training data cannot be centrally hosted. We introduce Mosaic Learning, a DL framework that decomposes models into fragments and disseminates them independently across the network. Fragmentation reduces redundant communication across correlated parameters and enables more diverse information propagation without increasing communication cost. We theoretically show that Mosaic Learning (i) achieves a state-of-the-art worst-case convergence rate, and (ii) leverages parameter correlation in an ML model, improving contraction by reducing the largest eigenvalue of a simplified system. We empirically evaluate Mosaic Learning on four learning tasks and observe up to 12 percentage points higher node-level test accuracy compared to epidemic learning (EL), a state-of-the-art baseline. In summary, Mosaic Learning improves DL performance without sacrificing utility or efficiency, positioning it as a new DL standard.
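The core mechanism described in the abstract, splitting a model into fragments and gossiping each fragment along its own independently sampled set of peers, can be sketched as follows. This is a minimal illustration only: the fragment boundaries, the sampling scheme, and the names `fragment_model` and `disseminate` are assumptions for exposition, not the paper's actual algorithm (which may, e.g., group parameters by correlation).

```python
import random

def fragment_model(params, num_fragments):
    """Split a flat parameter list into contiguous fragments.
    Illustrative assumption: the paper's real scheme may partition
    parameters differently (e.g., by layer or correlation)."""
    size = (len(params) + num_fragments - 1) // num_fragments
    return [params[i:i + size] for i in range(0, len(params), size)]

def disseminate(fragments, peers, fanout, rng):
    """Send each fragment to an independently sampled peer subset,
    so different fragments take different gossip paths through the
    network instead of the whole model moving as one unit."""
    plan = {}
    for idx, frag in enumerate(fragments):
        targets = rng.sample(peers, fanout)  # fresh sample per fragment
        plan[idx] = (targets, frag)
    return plan

rng = random.Random(0)
params = list(range(10))            # toy "model": 10 scalar parameters
frags = fragment_model(params, 3)   # 3 contiguous fragments
plan = disseminate(frags, peers=["A", "B", "C", "D"], fanout=2, rng=rng)
```

Because each fragment's recipients are drawn independently, information from one node spreads along more distinct paths per round than whole-model gossip at the same total communication cost, which is the intuition behind the claimed diversity benefit.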
Source: arXiv:2602.04352