RefracGS: Novel View Synthesis Through Refractive Water Surfaces with 3D Gaussian Ray Tracing
1️⃣ One-sentence summary
This paper proposes RefracGS, a method that jointly reconstructs the rippled water surface and the underwater scene, accurately modeling the complex refracted light paths at the water surface so as to efficiently and photorealistically synthesize images of the underwater scene from arbitrary viewpoints.
Novel view synthesis (NVS) through non-planar refractive surfaces presents fundamental challenges due to severe, spatially varying optical distortions. While recent representations like NeRF and 3D Gaussian Splatting (3DGS) excel at NVS, their assumption of straight-line ray propagation fails under these conditions, leading to significant artifacts. To overcome this limitation, we introduce RefracGS, a framework that jointly reconstructs the refractive water surface and the scene beneath the interface. Our key insight is to explicitly decouple the refractive boundary from the target objects: the refractive surface is modeled via a neural height field, capturing wave geometry, while the underlying scene is represented as a 3D Gaussian field. We formulate a refraction-aware Gaussian ray tracing approach that accurately computes non-linear ray trajectories using Snell's law and efficiently renders the underlying Gaussian field while backpropagating the loss gradients to the parameterized refractive surface. Through end-to-end joint optimization of both representations, our method ensures high-fidelity NVS and view-consistent surface recovery. Experiments on both synthetic and real-world scenes with complex waves demonstrate that RefracGS outperforms prior refractive methods in visual quality, while achieving 15x faster training and real-time rendering at 200 FPS. The project page for RefracGS is available at this https URL.
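The core geometric step the abstract describes is bending each camera ray at the water surface with Snell's law before tracing it into the underwater Gaussian field. As a minimal illustration of that bending step (not the paper's implementation; the function name `refract` and the example vectors are ours), the standard vector form of Snell's law can be sketched as:

```python
import numpy as np

def refract(d, n, eta):
    """Refract a unit direction d at an interface with unit normal n
    (n points toward the incident medium); eta = n_incident / n_transmitted.
    Returns the refracted unit direction, or None on total internal reflection."""
    cos_i = -np.dot(d, n)                       # cosine of the incidence angle
    sin2_t = eta * eta * (1.0 - cos_i * cos_i)  # Snell's law: sin(t) = eta * sin(i)
    if sin2_t > 1.0:
        return None                             # total internal reflection
    cos_t = np.sqrt(1.0 - sin2_t)
    return eta * d + (eta * cos_i - cos_t) * n

# Example: a camera ray entering water (n_air = 1.0, n_water = 1.333) at 45 degrees.
d = np.array([1.0, -1.0, 0.0]) / np.sqrt(2.0)  # downward-slanting incident ray
n = np.array([0.0, 1.0, 0.0])                  # flat water-surface normal (upward)
t = refract(d, n, 1.0 / 1.333)
```

In RefracGS the normal `n` would instead come from the gradient of the learned neural height field at the ray-surface intersection, which is what lets gradients flow from the rendering loss back into the surface parameters.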
Source: arXiv:2603.21695