arXiv submission date: 2026-01-29
📄 Abstract - A Low-Complexity Plug-and-Play Deep Learning Model for Generalizable Massive MIMO Precoding

Massive multiple-input multiple-output (mMIMO) downlink precoding offers high spectral efficiency but remains challenging to deploy in practice because near-optimal algorithms such as the weighted minimum mean squared error (WMMSE) are computationally expensive and sensitive to SNR and channel-estimation quality, while existing deep learning (DL)-based solutions often lack robustness and require retraining for each deployment site. This paper proposes a plug-and-play precoder (PaPP), a DL framework with a backbone that can be trained for either fully digital (FDP) or hybrid beamforming (HBF) precoding and reused across sites, transmit-power levels, and varying amounts of channel-estimation error, avoiding the need to train a new model from scratch at each deployment. PaPP combines a high-capacity teacher and a compact student with a self-supervised loss that balances teacher imitation and normalized sum-rate, trained using meta-learning domain generalization and transmit-power-aware input normalization. Numerical results on ray-tracing data from three unseen sites show that the PaPP FDP and HBF models both outperform conventional and deep learning baselines after fine-tuning with a small set of local unlabeled samples. Across both architectures, PaPP achieves more than a 21$\times$ reduction in modeled computation energy and maintains good performance under channel-estimation errors, making it a practical solution for energy-efficient mMIMO precoding.
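
The student's self-supervised loss described above (a balance between imitating the teacher's precoder and maximizing a normalized sum-rate) can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes a MU-MISO downlink signal model, and the tensor shapes, the weighting factor `alpha`, and the choice of normalization reference `rate_ref` are all assumptions.

```python
# Minimal sketch (not the paper's implementation) of a self-supervised student
# loss that trades off teacher imitation against normalized sum-rate.
import torch

def sum_rate(H: torch.Tensor, W: torch.Tensor, noise_var: float) -> torch.Tensor:
    """Downlink sum-rate under an assumed MU-MISO model.

    H: (B, K, Nt) complex channels, W: (B, Nt, K) complex precoders.
    """
    # Effective gains: G[b, k, j] = h_k^H w_j
    G = torch.einsum('bkn,bnj->bkj', H.conj(), W)
    sig = G.diagonal(dim1=-2, dim2=-1).abs() ** 2          # desired power, (B, K)
    interf = (G.abs() ** 2).sum(dim=-1) - sig               # interference, (B, K)
    sinr = sig / (interf + noise_var)
    return torch.log2(1.0 + sinr).sum(dim=-1)               # per-sample rate, (B,)

def papp_student_loss(W_student, W_teacher, H, noise_var, rate_ref, alpha=0.5):
    """alpha * imitation + (1 - alpha) * (1 - normalized rate).

    rate_ref is a per-sample normalization constant (e.g. the teacher's rate);
    its exact definition here is an assumption.
    """
    imitation = (W_student - W_teacher).abs().pow(2).mean()
    rate_norm = sum_rate(H, W_student, noise_var) / rate_ref.clamp_min(1e-9)
    return alpha * imitation + (1.0 - alpha) * (1.0 - rate_norm).mean()
```

Because the rate term needs only the channel and the student's own output, this loss can be evaluated on unlabeled local samples, which is consistent with the fine-tuning setup the abstract describes.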

Top-level tags: systems, model training, machine learning
Detailed tags: massive MIMO precoding, meta-learning, domain generalization, energy efficiency

A Low-Complexity Plug-and-Play Deep Learning Model for Generalizable Massive MIMO Precoding


1️⃣ One-Sentence Summary

This paper proposes PaPP, a plug-and-play deep learning model that combines meta-learning with self-supervised training so that a single trained model can be applied directly across different base stations, transmit-power levels, and channel-quality conditions without retraining for each new scenario, substantially reducing the computational energy and deployment effort of massive MIMO precoding while preserving high performance. A small sketch of the power-aware input handling this relies on follows below.
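
One ingredient behind this cross-power generalization is the transmit-power-aware input normalization mentioned in the abstract. The sketch below is a minimal illustration under assumed conventions: the scaling rule (absorbing the SNR into the channel input, then normalizing per sample) and all variable names are illustrative, not taken from the paper.

```python
# Minimal sketch of transmit-power-aware input normalization: channel samples
# observed at different transmit-power (SNR) levels are rescaled so the network
# sees inputs with comparable statistics. The exact rule is an assumption.
import torch

def power_aware_normalize(H: torch.Tensor, tx_power: torch.Tensor, noise_var: float):
    """H: (B, K, Nt) complex channels, tx_power: (B,) transmit power per sample."""
    # Absorb the per-sample SNR into the channel.
    snr_scale = torch.sqrt(tx_power / noise_var).view(-1, 1, 1)
    H_scaled = H * snr_scale
    # Normalize each sample to unit Frobenius norm.
    fro = torch.linalg.norm(H_scaled.reshape(H.shape[0], -1), dim=-1).view(-1, 1, 1)
    return H_scaled / fro.clamp_min(1e-12)
```

The intent of such a step is that a model trained at one power level does not see out-of-distribution input magnitudes when deployed at another, which is one plausible way to support the "no retraining per power level" claim.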

Source: arXiv 2601.21897