ORIGINAL RESEARCH article

Front. Big Data
Sec. Machine Learning and Artificial Intelligence
Volume 7 - 2024 | doi: 10.3389/fdata.2024.1431346

Camera-view Supervision for Bird's-Eye-View Semantic Segmentation

Provisionally accepted
  • The University of Texas at Dallas, Richardson, United States

The final, formatted version of the article will be published soon.

    Bird's-eye-view Semantic Segmentation (BEVSS) is a powerful and crucial component of planning and control systems in many autonomous vehicles. Current methods rely on end-to-end learning to train models. We propose a novel method of supervising feature extraction with camera-view depth and segmentation information, which improves the quality of feature extraction and projection in the BEVSS pipeline. Through extensive empirical evaluation, we demonstrate that our approach achieves superior performance compared to existing methods, improving the robustness and reliability of BEVSS for autonomous driving systems. Our method achieves very competitive inference and training computational cost compared to other real-time BEVSS methods, while maintaining superior accuracy. Implementation details and code can be found at https://github.com/bluffish/sucam.
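As an illustration only (not the authors' implementation; see the linked repository for that), the abstract's core idea of adding camera-view auxiliary supervision on top of the main BEV segmentation objective can be sketched as a combined loss. All function names and loss weights below are hypothetical placeholders, shown with NumPy for self-containment:

```python
import numpy as np

def cross_entropy(logits, labels):
    """Softmax cross-entropy over the last (class) axis, averaged over pixels."""
    z = logits - logits.max(axis=-1, keepdims=True)          # numerical stability
    log_probs = z - np.log(np.exp(z).sum(axis=-1, keepdims=True))
    # Pick the log-probability of the ground-truth class at each pixel.
    return -np.take_along_axis(log_probs, labels[..., None], axis=-1).mean()

def total_loss(bev_logits, bev_labels,
               cam_seg_logits, cam_seg_labels,
               pred_depth, gt_depth,
               w_seg=1.0, w_depth=1.0):
    """Main BEV segmentation loss plus camera-view auxiliary terms.

    The auxiliary terms supervise the per-camera features directly:
    a segmentation loss in the camera view and an L1 depth loss,
    weighted by the (hypothetical) coefficients w_seg and w_depth.
    """
    l_bev = cross_entropy(bev_logits, bev_labels)            # main BEV objective
    l_cam_seg = cross_entropy(cam_seg_logits, cam_seg_labels)  # camera-view segmentation
    l_depth = np.abs(pred_depth - gt_depth).mean()           # camera-view depth (L1)
    return l_bev + w_seg * l_cam_seg + w_depth * l_depth
```

In this sketch the camera-view terms act purely as training-time supervision on the feature extractor; at inference only the BEV head is needed, which is consistent with the abstract's claim of competitive inference cost.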

    Keywords: segmentation, perception, autonomous driving (AD), supervision, bird's-eye-view, nuScenes dataset

    Received: 11 May 2024; Accepted: 30 Oct 2024.

    Copyright: © 2024 Yang, Yu and Chen. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

    * Correspondence: Bowen Yang, The University of Texas at Dallas, Richardson, United States

    Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.