arXiv submission date: 2026-02-24
📄 Abstract - Event-Aided Sharp Radiance Field Reconstruction for Fast-Flying Drones

Fast-flying aerial robots promise rapid inspection under limited battery constraints, with direct applications in infrastructure inspection, terrain exploration, and search and rescue. However, high speeds lead to severe motion blur in images and induce significant drift and noise in pose estimates, making dense 3D reconstruction with Neural Radiance Fields (NeRFs) particularly challenging due to their high sensitivity to such degradations. In this work, we present a unified framework that leverages asynchronous event streams alongside motion-blurred frames to reconstruct high-fidelity radiance fields from agile drone flights. By embedding event-image fusion into NeRF optimization and jointly refining event-based visual-inertial odometry priors using both event and frame modalities, our method recovers sharp radiance fields and accurate camera trajectories without ground-truth supervision. We validate our approach on both synthetic data and real-world sequences captured by a fast-flying drone. Despite highly dynamic drone flights, where RGB frames are severely degraded by motion blur and pose priors become unreliable, our method reconstructs high-fidelity radiance fields and preserves fine scene details, delivering a performance gain of over 50% on real-world data compared to state-of-the-art methods.

Top-level tags: computer vision, robotics, systems
Detailed tags: neural radiance fields, event cameras, visual-inertial odometry, 3d reconstruction, motion blur

Event-Aided Sharp Radiance Field Reconstruction for Fast-Flying Drones


1️⃣ One-Sentence Summary

This work proposes a new method that fuses motion-blurred images captured by fast-flying drones with asynchronous event-stream data to reconstruct sharp, high-fidelity 3D scene models, addressing the image blur and unreliable pose estimates caused by rapid motion.

Source: arXiv: 2602.21101