ORIGINAL RESEARCH article
Front. Neurosci.
Sec. Neuromorphic Engineering
Volume 18 - 2024 | doi: 10.3389/fnins.2024.1477979
A Recurrent YOLOv8-based Framework for Event-Based Object Detection
Provisionally accepted
- 1 King Abdullah University of Science and Technology, Thuwal, Makkah, Saudi Arabia
- 2 Cairo University, Giza, Giza, Egypt
- 3 Compumacy for Artificial Intelligence solutions, Cairo, Egypt
Object detection is crucial in various cutting-edge applications, such as autonomous vehicles and advanced robotics systems, which primarily rely on data from conventional frame-based RGB sensors. However, these sensors often struggle with issues like motion blur and poor performance in challenging lighting conditions. In response to these challenges, event-based cameras have emerged as an innovative paradigm. These cameras, inspired by biological vision systems mimicking the human eye, demonstrate superior performance in environments with fast motion and extreme lighting conditions while consuming less power. This study introduces Recurrent YOLOv8 (ReYOLOv8), an advanced object detection framework that enhances a leading frame-based detection system with spatiotemporal modeling capabilities. We implemented a low-latency, memory-efficient method for encoding event data to boost the system's performance. Additionally, we developed a novel data augmentation technique tailored to leverage the unique attributes of event data, thus improving detection accuracy. Our framework was evaluated on the comprehensive event-based datasets Prophesee's Generation 1 (GEN1) and Person Detection for Robotics (PEDRo). On the GEN1 dataset, which focuses on automotive applications, our models outperformed all comparable approaches, achieving mean Average Precision (mAP) improvements of 5%, 2.8%, and 2.5% across the nano, small, and medium scales, respectively. These gains were achieved while reducing the number of trainable parameters by an average of 4.43% and maintaining real-time processing speeds between 9.2 ms and 15.5 ms. On the PEDRo dataset, which targets robotics applications, our models showed mAP improvements ranging from 9% to 18%, with models 14.5x and 3.8x smaller and an average speed improvement of 1.67x. These results highlight the potential of bio-inspired event-based vision sensors when integrated with advanced object detection frameworks. By bridging the gap between biological principles of vision and artificial intelligence, this work demonstrates the promise of event-based technologies for robust and efficient visual processing systems in dynamic and complex environments.
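For readers less familiar with event-stream preprocessing, the sketch below illustrates one common way to turn raw events into a dense tensor that a frame-based detector such as YOLOv8 can consume. It is a generic voxel-grid-style accumulation written for illustration only: the function name events_to_voxel_grid, the signed polarity accumulation, and the choice of bin count are assumptions made here and do not reproduce the low-latency encoding proposed in this article.

import numpy as np

def events_to_voxel_grid(events, num_bins, height, width):
    # Illustrative (hypothetical) encoding, not the authors' method.
    # events: float array of shape (N, 4) with columns
    #         timestamp, x, y, polarity (polarity in {0, 1}).
    grid = np.zeros((num_bins, height, width), dtype=np.float32)
    if len(events) == 0:
        return grid
    t = events[:, 0]
    # Normalize timestamps to [0, 1) and assign each event to a temporal bin.
    t_norm = (t - t.min()) / max(t.max() - t.min(), 1e-9)
    bins = np.clip((t_norm * num_bins).astype(int), 0, num_bins - 1)
    x = events[:, 1].astype(int)
    y = events[:, 2].astype(int)
    pol = np.where(events[:, 3] > 0, 1.0, -1.0)  # map polarity to +/-1
    # Signed accumulation: ON events add, OFF events subtract.
    np.add.at(grid, (bins, y, x), pol)
    return grid

In such a pipeline, the resulting (num_bins, H, W) tensor would be fed to the detector in place of an RGB frame, with the recurrent layers carrying state across consecutive event windows.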
Keywords: object detection, YOLO, event-based cameras, data augmentation, autonomous driving
Received: 08 Aug 2024; Accepted: 20 Dec 2024.
Copyright: © 2024 Silva, Smagulova, Elsheikh, Fouda and Eltawil. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
* Correspondence:
Mohammed Fouda, Compumacy for Artificial Intelligence solutions, Cairo, Egypt
Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.