arXiv submission date: 2026-03-30
📄 Abstract - Physically Inspired Gaussian Splatting for HDR Novel View Synthesis

High dynamic range novel view synthesis (HDR-NVS) reconstructs scenes with rich dynamic-range detail by fusing multi-exposure low dynamic range (LDR) views, yet it struggles to capture ambient illumination-dependent appearance. Implicitly supervising HDR content by constraining only the tone-mapped results fails to correct abnormal HDR values and yields limited gradients for Gaussians in under- and over-exposed regions. To this end, we introduce PhysHDR-GS, a physically inspired HDR-NVS framework that models scene appearance via intrinsic reflectance and adjustable ambient illumination. PhysHDR-GS employs a complementary image-exposure (IE) branch and Gaussian-illumination (GI) branch to faithfully reproduce standard camera observations and to capture illumination-dependent appearance changes, respectively. During training, the proposed cross-branch HDR consistency loss provides explicit supervision for HDR content, while an illumination-guided gradient scaling strategy mitigates exposure-biased gradient starvation and reduces under-densified representations. Experimental results across real and synthetic datasets demonstrate our method's superiority in reconstructing HDR details (e.g., a PSNR gain of 2.04 dB over HDR-GS) while maintaining real-time rendering speed (up to 76 FPS). Code and models are available at this https URL.
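The abstract does not spell out the illumination-guided gradient scaling formula. As a rough intuition only, the idea of upweighting gradients for Gaussians that project into under- or over-exposed regions can be sketched as below; the weighting function, thresholds, and `boost` factor here are hypothetical illustrations, not the paper's actual formulation.

```python
import numpy as np

def exposure_gradient_scale(ldr, low=0.05, high=0.95, boost=4.0):
    """Hypothetical per-pixel gradient weights for exposure-biased regions.

    Pixels whose LDR intensity sits near 0 (under-exposed) or near 1
    (over-exposed) receive weights up to (1 + boost), so the Gaussians
    covering them are not starved of gradient signal; well-exposed
    pixels keep a weight of 1.
    """
    ldr = np.asarray(ldr, dtype=np.float64)
    under = np.clip((low - ldr) / low, 0.0, 1.0)            # 1 at ldr = 0
    over = np.clip((ldr - high) / (1.0 - high), 0.0, 1.0)   # 1 at ldr = 1
    return 1.0 + boost * np.maximum(under, over)

# Example: dark, mid-tone, and saturated pixels
ldr = np.array([0.0, 0.02, 0.5, 0.97, 1.0])
weights = exposure_gradient_scale(ldr)  # mid-tone pixel keeps weight 1.0
```

In a training loop, such weights would multiply the per-pixel loss (or its gradient) before backpropagating to the Gaussian parameters.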

Top-level tags: computer vision, model training, systems
Detailed tags: novel view synthesis, gaussian splatting, high dynamic range, 3d reconstruction, real-time rendering

Physically Inspired Gaussian Splatting for HDR Novel View Synthesis


1️⃣ One-sentence summary

This paper proposes PhysHDR-GS, a method that models a scene's physical reflectance properties together with adjustable ambient illumination. It addresses the difficulty existing techniques have in capturing illumination changes and fine detail when synthesizing high dynamic range novel views, improving image quality while preserving real-time rendering speed.

Source: arXiv:2603.28020