SEF-MAP: Subspace-Decomposed Expert Fusion for Robust Multimodal HD Map Prediction
1️⃣ One-Sentence Summary
This paper proposes a new method called SEF-MAP, which decomposes features from different sensors (such as cameras and LiDAR) into distinct semantic subspaces, has dedicated "expert" modules process each subspace separately, and then adaptively fuses their outputs, enabling more reliable and accurate prediction of the high-definition maps needed for autonomous driving under a wide range of adverse conditions.
High-definition (HD) maps are essential for autonomous driving, yet multi-modal fusion often suffers from inconsistency between camera and LiDAR modalities, leading to performance degradation under low-light conditions, occlusions, or sparse point clouds. To address this, we propose SEF-MAP, a Subspace-Expert Fusion framework for robust multimodal HD map prediction. The key idea is to explicitly disentangle BEV features into four semantic subspaces: LiDAR-private, Image-private, Shared, and Interaction. Each subspace is assigned a dedicated expert, thereby preserving modality-specific cues while capturing cross-modal consensus. To adaptively combine expert outputs, we introduce an uncertainty-aware gating mechanism at the BEV-cell level, where unreliable experts are down-weighted based on predictive variance, complemented by a usage-balance regularizer to prevent expert collapse. To enhance robustness in degraded conditions and promote role specialization, we further propose distribution-aware masking: during training, modality-drop scenarios are simulated using EMA-statistical surrogate features, and a specialization loss enforces distinct behaviors of private, shared, and interaction experts across complete and masked inputs. Experiments on the nuScenes and Argoverse2 benchmarks demonstrate that SEF-MAP achieves state-of-the-art performance, surpassing prior methods by +4.2% and +4.8% mAP, respectively. SEF-MAP provides a robust and effective solution for multi-modal HD map prediction under diverse and degraded conditions.
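The uncertainty-aware gating described in the abstract can be illustrated with a minimal sketch: per BEV cell, each expert's gate weight is a softmax over negated predictive variances, so high-variance (unreliable) experts are down-weighted. This is a hedged, self-contained NumPy illustration of the general idea, not the paper's actual implementation; the function name, shapes, and temperature parameter are assumptions for the example.

```python
import numpy as np

def uncertainty_gated_fusion(expert_means, expert_vars, temperature=1.0):
    """Fuse per-expert BEV predictions, down-weighting uncertain experts.

    expert_means: (E, H, W, C) per-expert feature predictions on the BEV grid
    expert_vars:  (E, H, W)    per-expert predictive variance per BEV cell
    Returns the fused (H, W, C) features and the (E, H, W) gate weights.
    (Illustrative sketch; not the paper's exact gating network.)
    """
    # Lower variance -> higher confidence -> larger gate logit.
    logits = -expert_vars / temperature              # (E, H, W)
    logits = logits - logits.max(axis=0, keepdims=True)  # numerical stability
    gates = np.exp(logits)
    gates /= gates.sum(axis=0, keepdims=True)        # softmax over experts, per cell
    # Weighted sum of expert outputs at every BEV cell.
    fused = (gates[..., None] * expert_means).sum(axis=0)
    return fused, gates
```

With two experts where one reports much higher variance on every cell, its gate weight collapses toward zero and the fused output follows the confident expert.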
Source: arXiv: 2602.21589