[2602.23871] Bandwidth-adaptive Cloud-Assisted 360-Degree 3D Perception for Autonomous Vehicles
Computer Science > Computer Vision and Pattern Recognition
arXiv:2602.23871 (cs)
[Submitted on 27 Feb 2026]

Title: Bandwidth-adaptive Cloud-Assisted 360-Degree 3D Perception for Autonomous Vehicles
Authors: Faisal Hawladera, Rui Meireles, Gamal Elghazaly, Ana Aguiar, Raphaël Frank

Abstract: A key challenge for autonomous driving is maintaining real-time situational awareness of surrounding obstacles under strict latency constraints. High processing requirements coupled with limited onboard computational resources can cause delays, particularly in complex urban settings. To address this, we propose leveraging Vehicle-to-Everything (V2X) communication to partially offload processing to the cloud, where compute resources are abundant, thus reducing overall latency. Our approach uses transformer-based models to fuse multi-camera sensor data into a comprehensive Bird's-Eye View (BEV) representation, enabling accurate 360-degree 3D object detection. The computation is split dynamically between the vehicle and the cloud according to the number of layers processed locally and the quantization level of the features. To further reduce network load, we apply feature vector clipping and compression prior to transmission. In a real-world experimental evaluation, our hybrid strategy achieved a 72% reducti...
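The abstract's bandwidth-reduction step (clip feature values, quantize them, then compress before uplink) can be illustrated with a minimal sketch. The function names, the clipping range, and the use of zlib are illustrative assumptions for demonstration, not the paper's actual implementation:

```python
# Hedged sketch of a clip -> quantize -> compress pipeline for feature
# vectors prior to V2X transmission. The clip range and codec choice are
# assumptions; the paper does not specify them here.
import struct
import zlib
import random

CLIP_MIN, CLIP_MAX = -4.0, 4.0  # assumed clipping range for features

def clip_quantize(features, bits=8):
    """Clip each float to [CLIP_MIN, CLIP_MAX] and map it to an
    unsigned integer with the given bit depth (one byte per value here)."""
    levels = (1 << bits) - 1
    scale = levels / (CLIP_MAX - CLIP_MIN)
    return bytes(
        int(round((min(max(x, CLIP_MIN), CLIP_MAX) - CLIP_MIN) * scale))
        for x in features
    )

def compress_for_uplink(features):
    """Quantize the feature vector, then zlib-compress the byte stream."""
    return zlib.compress(clip_quantize(features))

# Compare a float32 baseline against the quantized + compressed payload.
random.seed(0)
features = [random.gauss(0.0, 1.0) for _ in range(4096)]
raw = struct.pack(f"{len(features)}f", *features)  # 32-bit float baseline
payload = compress_for_uplink(features)
print(f"raw: {len(raw)} B, sent: {len(payload)} B")
```

Quantizing from 32-bit floats to 8 bits alone yields a 4x reduction; entropy coding then shaves off whatever redundancy remains in the quantized distribution.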