arXiv submission date: 2026-03-30
📄 Abstract - OptINC: Optical In-Network-Computing for Scalable Distributed Learning

Distributed learning is widely used for training large models on large datasets by distributing parts of the model or dataset across multiple devices and aggregating the computed results for subsequent computations or parameter updates. Existing communication algorithms for distributed learning, such as ring all-reduce, incur heavy communication overhead between servers. Since communication in large-scale systems already travels over optical fibers, we propose an Optical In-Network-Computing (OptINC) architecture that offloads server-side computation onto the optical interconnects. To execute gradient averaging and quantization in the optical domain, we incorporate optical devices such as Mach-Zehnder interferometers (MZIs) into the interconnects. The resulting de facto optical neural network (ONN) can effectively reduce the communication overhead of existing distributed training solutions. To reduce dataset complexity for training this neural network, a preprocessing algorithm implemented in the optical domain is also proposed. Hardware cost is lowered by approximating the weight matrices of the optical neural network with unitary and diagonal matrices, while accuracy is maintained by a proposed hardware-aware training algorithm. The proposed solution was evaluated on real distributed learning tasks, including ResNet50 on CIFAR-100 and a LLaMA-based network on Wikipedia-1B. In both cases, the proposed framework achieves training accuracy comparable to the ring all-reduce baseline while eliminating communication overhead.
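The abstract mentions two key ideas: averaging gradients across workers (the collective that ring all-reduce implements) and approximating the ONN's weight matrices with unitary and diagonal matrices. The sketch below is not the paper's actual algorithm; it only illustrates, under the assumption that a standard SVD-style factorization is meant, how a dense weight matrix decomposes into unitary and diagonal factors (the kind of factors an MZI mesh plus attenuators can realize), and how gradient averaging is a simple element-wise mean.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dense ONN weight matrix (illustrative size only).
W = rng.standard_normal((8, 8))

# SVD: W = U @ diag(s) @ Vh, with U and Vh unitary and diag(s) diagonal.
# Unitary factors map onto MZI meshes; the diagonal maps onto
# attenuators/amplifiers in a photonic implementation.
U, s, Vh = np.linalg.svd(W)
W_reconstructed = U @ np.diag(s) @ Vh

# The factorization is exact up to floating-point error.
max_err = np.max(np.abs(W - W_reconstructed))
print(f"max reconstruction error: {max_err:.2e}")

# Gradient averaging: the operation ring all-reduce computes, and which
# OptINC offloads to the optical path, is an element-wise mean over workers.
worker_grads = [rng.standard_normal(4) for _ in range(4)]
avg_grad = np.mean(worker_grads, axis=0)
print("averaged gradient:", avg_grad)
```

In the actual system the diagonal factor would also be where magnitudes (and hence quantization levels) are controlled, since MZI meshes alone can only realize unitary, norm-preserving transforms.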

Top-level tags: systems, machine learning, model training
Detailed tags: optical computing, distributed learning, in-network computing, hardware acceleration, gradient averaging

OptINC: Optical In-Network-Computing for Scalable Distributed Learning


1️⃣ One-sentence summary

This paper proposes OptINC, a novel optical in-network-computing architecture that performs gradient averaging and quantization directly with optical devices embedded in the fiber interconnects, effectively eliminating the communication overhead of large-scale distributed machine-learning training while maintaining model accuracy comparable to existing methods.

Source: arXiv:2603.28290