
ORIGINAL RESEARCH article

Front. Robot. AI
Sec. Robot Vision and Artificial Perception
Volume 11 - 2024 | doi: 10.3389/frobt.2024.1450266
This article is part of the Research Topic "Using Sensors to Achieve Safety and Accurate Control in Robotics".

Simulation and Real-Life Implementation of UAV Autonomous Landing System Based on Object Recognition and Tracking for Safe Landing in Uncertain Environments

Provisionally accepted
  • 1 Kpro System, Namyangju, Republic of Korea
  • 2 Chodang University, Muan County, South Jeolla, Republic of Korea

The final, formatted version of the article will be published soon.

    The use of autonomous Unmanned Aerial Vehicles (UAVs) has been increasing, making the autonomy of these systems and their ability to cope with uncertainty crucial. Autonomous landing is pivotal to the success of an autonomous UAV mission. This paper presents an autonomous landing system for quadrotor UAVs capable of performing a smooth landing even under unfavorable conditions, such as obstacles in and around the designated landing area, or when the visual marker establishing the designated landing area is absent or cannot be identified. We integrate You Only Look Once version 5 (YOLOv5), DeepSORT, the Euclidean distance transform, and a Proportional-Integral-Derivative (PID) controller to strengthen the robustness of the overall system. The YOLOv5 model is trained to identify the visual marker of the landing area as well as common obstacles such as people, cars, and trees, while the DeepSORT algorithm tracks the detected objects across frames. When necessary, the detected objects and the Euclidean distance transform are used to identify an obstacle-free open space in which to land. Finally, the PID controller generates appropriate movement commands for the UAV from the visual cues of the target landing area and the obstacles. To validate the overall system without risking the safety of the people involved, initial tests are carried out in a software-based simulation before the experiments are executed in real life. A complete hardware platform equipped with the autonomous landing system is then built and tested in real life across various scenarios to verify its effectiveness. The code is available at this repository: https://github.com/rnjbdya/Vision-based-UAV-autonomous-landing
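
    To make the described pipeline concrete, the following minimal Python sketch (not the authors' code; see the linked repository for their implementation) shows how detected obstacle boxes could feed a Euclidean distance transform to select the most open landing point, and how a simple PID controller could turn the resulting pixel offset into velocity commands. All function names, gains, and example values below are illustrative assumptions.

# Hedged sketch, not the authors' implementation (see the linked repository for that).
# It assumes obstacle bounding boxes from a detector/tracker such as YOLOv5 + DeepSORT
# are already available as (x1, y1, x2, y2) pixel coordinates.
import numpy as np
import cv2

def find_open_landing_point(frame_shape, obstacle_boxes):
    """Pick the pixel farthest from every obstacle via the Euclidean distance transform."""
    h, w = frame_shape[:2]
    free = np.full((h, w), 255, dtype=np.uint8)           # 255 = free space
    for x1, y1, x2, y2 in obstacle_boxes:
        cv2.rectangle(free, (int(x1), int(y1)), (int(x2), int(y2)), 0, -1)  # 0 = obstacle
    dist = cv2.distanceTransform(free, cv2.DIST_L2, 5)    # distance to nearest obstacle pixel
    _, max_dist, _, max_loc = cv2.minMaxLoc(dist)         # max_loc = most open (x, y) location
    return max_loc, max_dist

class PID:
    """Minimal PID controller mapping a normalized pixel error to a velocity command."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral, self.prev_error = 0.0, 0.0

    def step(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

if __name__ == "__main__":
    # Hypothetical obstacle detections for a 640x480 frame; in the real system these
    # would come from the YOLOv5 detector and the DeepSORT tracker.
    boxes = [(100, 80, 220, 300), (400, 350, 520, 470)]
    (tx, ty), clearance = find_open_landing_point((480, 640), boxes)
    pid_x, pid_y = PID(0.4, 0.0, 0.1, dt=0.05), PID(0.4, 0.0, 0.1, dt=0.05)
    vx = pid_x.step((tx - 320) / 320.0)   # horizontal offset from image centre, normalized
    vy = pid_y.step((ty - 240) / 240.0)   # vertical offset from image centre, normalized
    print(f"target=({tx},{ty}) clearance={clearance:.1f}px command=({vx:.3f},{vy:.3f})")

    In the actual system, the obstacle boxes would be supplied by the YOLOv5 detector and the DeepSORT tracker rather than the hard-coded examples used here, and the PID gains would be tuned for the specific quadrotor platform.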

    Keywords: autonomous landing, DeepSORT, distance transform, intelligent autonomous system, obstacle avoidance, object detection, PID control, YOLOv5

    Received: 17 Jun 2024; Accepted: 10 Sep 2024.

    Copyright: © 2024 Baidya and Jeong. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

    * Correspondence: Heon Jeong, Chodang University, Muan County, 534-701, South Jeolla, Republic of Korea

    Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.