[2510.12901] SimULi: Real-Time LiDAR and Camera Simulation with Unscented Transforms
Computer Science > Computer Vision and Pattern Recognition

arXiv:2510.12901 (cs) [Submitted on 14 Oct 2025 (v1), last revised 28 Mar 2026 (this version, v3)]

Title: SimULi: Real-Time LiDAR and Camera Simulation with Unscented Transforms

Authors: Haithem Turki, Qi Wu, Xin Kang, Janick Martinez Esturo, Shengyu Huang, Ruilong Li, Zan Gojcic, Riccardo de Lutio

Abstract: Rigorous testing of autonomous robots, such as self-driving vehicles, is essential to ensure their safety in real-world deployments. This requires building high-fidelity simulators to test scenarios beyond those that can be safely or exhaustively collected in the real world. Existing neural rendering methods based on NeRF and 3DGS hold promise but suffer from low rendering speeds or can only render pinhole camera models, hindering their suitability for applications that commonly require high-distortion lenses and LiDAR data. Multi-sensor simulation poses additional challenges, as existing methods handle cross-sensor inconsistencies by favoring the quality of one modality at the expense of others. To overcome these limitations, we propose SimULi, the first method capable of rendering arbitrary camera models and LiDAR data in real time. Our method extends 3DGUT, which natively supports complex camera models, with LiDAR support, via an automated tiling...
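The unscented transform named in the title is the core primitive that 3DGUT (and by extension SimULi) relies on to push 3D Gaussians through nonlinear sensor models without linearization. The abstract does not give implementation details, so the following is only a minimal, generic sketch of the standard unscented transform: sigma points are drawn around a Gaussian's mean, mapped through a nonlinear function, and reweighted to recover the output mean and covariance. The `project` function here is a hypothetical stand-in (a simple perspective divide), not the distortion or LiDAR model used in the paper.

```python
import numpy as np

def unscented_transform(mean, cov, f, alpha=1.0, beta=2.0, kappa=0.0):
    """Propagate a Gaussian (mean, cov) through a nonlinear map f
    using the standard unscented transform (sigma points + weights)."""
    n = mean.shape[0]
    lam = alpha**2 * (n + kappa) - n
    # Sigma points: the mean plus symmetric offsets along the columns
    # of a matrix square root of (n + lam) * cov.
    L = np.linalg.cholesky((n + lam) * cov)
    pts = [mean]
    pts += [mean + L[:, i] for i in range(n)]
    pts += [mean - L[:, i] for i in range(n)]
    pts = np.array(pts)
    # Standard UT weights for the mean (wm) and covariance (wc).
    wm = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    # Map every sigma point through the nonlinearity, then reweight.
    y = np.array([f(p) for p in pts])
    y_mean = wm @ y
    d = y - y_mean
    y_cov = (wc[:, None] * d).T @ d
    return y_mean, y_cov

# Hypothetical nonlinear camera map: a perspective divide, standing in
# for the high-distortion lens models the paper targets.
def project(p):
    return p[:2] / p[2]

mu = np.array([0.5, -0.2, 4.0])        # 3D Gaussian mean (camera frame)
sigma = np.diag([0.05, 0.05, 0.1])     # 3D Gaussian covariance
m2d, c2d = unscented_transform(mu, sigma, project)
```

The appeal of this approach over a first-order (Jacobian) approximation, as used in classic 3DGS splatting, is that any camera model can be plugged in as `f` unchanged, since only point evaluations of the projection are required.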