arXiv submission date: 2026-02-24
📄 Abstract - SynthRender and IRIS: Open-Source Framework and Dataset for Bidirectional Sim-Real Transfer in Industrial Object Perception

Object perception is fundamental for tasks such as robotic material handling and quality inspection. However, modern supervised deep-learning perception models require large datasets for robust automation under semi-uncontrolled conditions. The cost of acquiring and annotating such data for proprietary parts is a major barrier to widespread deployment. In this context, we release SynthRender, an open-source framework for synthetic image generation with Guided Domain Randomization capabilities. Furthermore, we benchmark recent Reality-to-Simulation techniques for 3D asset creation from 2D images of real parts. Combined with Domain Randomization, these synthetic assets provide low-overhead, transferable data even for parts lacking 3D files. We also introduce IRIS, the Industrial Real-Sim Imagery Set, containing 32 categories with diverse textures, intra-class variation, strong inter-class similarities, and about 20,000 labels. Ablations on multiple benchmarks outline guidelines for efficient data generation with SynthRender. Our method surpasses existing approaches, achieving 99.1% mAP@50 on a public robotics dataset, 98.3% mAP@50 on an automotive benchmark, and 95.3% mAP@50 on IRIS.
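The Domain Randomization idea in the abstract can be sketched as sampling a fresh set of scene parameters (lighting, camera pose, textures, backgrounds) for every rendered training image, so the perception model never overfits to one fixed appearance. The sketch below is a minimal, generic illustration of this sampling loop; the parameter names and ranges are hypothetical and are not SynthRender's actual API.

```python
import random

def sample_scene_params(rng: random.Random) -> dict:
    """Sample one randomized scene configuration (Domain Randomization).

    All parameter names and ranges here are illustrative assumptions,
    not SynthRender's real interface.
    """
    return {
        "light_intensity": rng.uniform(0.2, 3.0),     # relative brightness
        "light_azimuth_deg": rng.uniform(0.0, 360.0), # light direction
        "camera_distance_m": rng.uniform(0.3, 1.5),   # viewpoint variation
        "texture_id": rng.randrange(50),              # random part texture
        "background_id": rng.randrange(20),           # random distractor scene
    }

def generate_dataset(n_images: int, seed: int = 0) -> list:
    """Produce n_images randomized scene configs; a renderer would then
    turn each config into an image plus auto-generated labels."""
    rng = random.Random(seed)
    return [sample_scene_params(rng) for _ in range(n_images)]

if __name__ == "__main__":
    for params in generate_dataset(3):
        print(params)
```

Seeding the generator makes each synthetic dataset reproducible, which matters for the kind of ablation studies the paper reports.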

Top-level tags: computer vision, robotics, systems
Detailed tags: synthetic data generation, domain randomization, sim-to-real transfer, object perception, industrial vision

SynthRender and IRIS: Open-Source Framework and Dataset for Bidirectional Sim-Real Transfer in Industrial Object Perception


1️⃣ One-Sentence Summary

This paper presents SynthRender, an open-source framework, and IRIS, an industrial image dataset, which together use synthetic image generation and bidirectional Reality-to-Simulation techniques to address the difficulty of acquiring real training data for industrial object perception at low cost and high efficiency, achieving excellent recognition accuracy across multiple benchmarks.

Source: arXiv: 2602.21141