arXiv submission date: 2026-03-30
📄 Abstract - Event6D: Event-based Novel Object 6D Pose Tracking

Event cameras provide microsecond latency, making them suitable for 6D object pose tracking in fast, dynamic scenes where conventional RGB and depth pipelines suffer from motion blur and large pixel displacements. We introduce EventTrack6D, an event-depth tracking framework that generalizes to novel objects without object-specific training by reconstructing both intensity and depth at arbitrary timestamps between depth frames. Conditioned on the most recent depth measurement, our dual reconstruction recovers dense photometric and geometric cues from sparse event streams. EventTrack6D operates at over 120 FPS and maintains temporal consistency under rapid motion. To support training and evaluation, we introduce a comprehensive benchmark suite: a large-scale synthetic dataset for training and two complementary evaluation sets, comprising real and simulated event datasets. Trained exclusively on synthetic data, EventTrack6D generalizes effectively to real-world scenarios without fine-tuning, maintaining accurate tracking across diverse objects and motion patterns. Our method and datasets validate the effectiveness of event cameras for 6D pose tracking of novel objects. Code and datasets are publicly available at this https URL.
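The abstract describes the pipeline at a high level: between sparse depth frames, dense intensity and depth are reconstructed from the event stream at arbitrary timestamps, conditioned on the most recent depth measurement, and the pose is then updated from those cues. A minimal illustrative sketch of that loop is below; all function names, data layouts, and the stubbed pose refinement are assumptions for illustration, not the paper's actual interfaces.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical event record: (x, y, timestamp, polarity).
Event = Tuple[int, int, float, int]

@dataclass
class Pose:
    # 6D pose: translation and rotation (parameterization is illustrative).
    translation: Tuple[float, float, float]
    rotation: Tuple[float, float, float]

def reconstruct_at(events: List[Event], last_depth, t: float):
    """Stand-in for the dual reconstruction described in the abstract:
    conditioned on the most recent depth frame, recover photometric and
    geometric cues from the events observed up to timestamp t."""
    window = [e for e in events if e[2] <= t]
    intensity = float(len(window))  # placeholder for a reconstructed image
    depth = last_depth              # placeholder for a reconstructed depth map
    return intensity, depth

def track(events: List[Event],
          depth_frames: List[Tuple[float, object]],
          query_times: List[float],
          init_pose: Pose) -> List[Tuple[float, Pose]]:
    """High-level tracking loop: at each query timestamp, condition on the
    most recent depth frame, reconstruct dense cues, and refine the pose
    (refinement is stubbed out here)."""
    poses = []
    pose = init_pose
    for t in query_times:
        # Most recent depth measurement at or before t.
        last_depth = max((d for d in depth_frames if d[0] <= t),
                         key=lambda d: d[0])[1]
        intensity, depth = reconstruct_at(events, last_depth, t)
        # A real tracker would register the reconstructed cues against the
        # object model here; we simply carry the previous pose forward.
        poses.append((t, pose))
    return poses
```

The point of the sketch is the control flow, not the reconstruction itself: pose updates are decoupled from the depth-frame rate, which is what lets the method run at event-camera timescales.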

Top-level tags: robotics, computer vision, systems
Detailed tags: 6d pose tracking, event cameras, novel objects, synthetic training, real-time tracking

Event6D: Event-based Novel Object 6D Pose Tracking


1️⃣ One-Sentence Summary

This work proposes EventTrack6D, a method that uses high-speed event cameras to track the 3D position and orientation of novel objects in real time in fast, dynamic scenes, without any object-specific training, and releases accompanying training and evaluation datasets.

Source: arXiv:2603.28045