ORIGINAL RESEARCH article

Front. Robot. AI
Sec. Robot Vision and Artificial Perception
Volume 11 - 2024 | doi: 10.3389/frobt.2024.1490812
This article is part of the Research Topic: Computer Vision Mechanisms for Resource-Constrained Robotics Applications

A Versatile Real-Time Vision-Led Runway Localisation System for Enhanced Autonomy

Provisionally accepted
Kyriacos Tsapparellas 1, Nickolay Jelev 2, Jonathon Waters 3, Aditya M Shrikhande 1*, Sabine Brunswicker 4, Lyudmila S Mihaylova 1
  • 1 The University of Sheffield, Sheffield, United Kingdom
  • 2 Windracers, Southampton, United Kingdom
  • 3 Distributed Avionics, Southampton, United Kingdom
  • 4 Purdue University, West Lafayette, Indiana, United States

The final, formatted version of the article will be published soon.

    This paper proposes a solution to the challenging task of autonomously landing Unmanned Aerial Vehicles (UAVs). The main contribution of this work is a vision-led system for runway detection that aids autonomous landing of UAVs using electro-optical cameras. An onboard computer vision module integrates the vision system with the ground control communication and video server connection. The vision platform performs feature extraction using Speeded-Up Robust Features (SURF) [1], followed by fast Structured Forests edge detection [2] and smoothing with a Kalman filter for accurate prediction of the runway sidelines. A thorough evaluation of accuracy and processing time is performed over real-world and simulation environments, in comparison with state-of-the-art edge detection approaches. The vision system is validated over videos with clear and difficult weather conditions, including fog, varying lighting conditions and crosswind landings. The experiments use data from the X-Plane 11 flight simulator and real flight data from the Uncrewed Low-cost TRAnsport (ULTRA) self-flying cargo UAV [3]. With the Structured Forests approach, the vision-led system localises the runway sidelines with an accuracy of approximately 84.4%, outperforming state-of-the-art approaches while delivering real-time performance. Although developed for the ULTRA UAV, the vision-led system is applicable to any other UAV.
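    To make the described three-stage pipeline concrete, the sketch below shows one way the stages could be chained with OpenCV in Python. This is a minimal illustration under stated assumptions, not the authors' implementation: it assumes an OpenCV build with the contrib modules (SURF requires a non-free build), a pretrained Structured Forests model file (the path "model.yml.gz" is a hypothetical placeholder), and a simple constant-velocity Kalman filter over one sideline's (rho, theta) Hough parameters.

import cv2
import numpy as np

# SURF keypoint detector (contrib module; needs a non-free OpenCV build)
surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)

# Structured Forests edge detector, loaded from a pretrained model file
# (the path below is a placeholder, not taken from the paper)
edge_model = cv2.ximgproc.createStructuredEdgeDetection("model.yml.gz")

# Constant-velocity Kalman filter over one sideline's Hough parameters:
# state = [rho, theta, d_rho, d_theta], measurement = [rho, theta]
kf = cv2.KalmanFilter(4, 2)
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], np.float32)
kf.processNoiseCov = 1e-3 * np.eye(4, dtype=np.float32)
kf.measurementNoiseCov = 1e-1 * np.eye(2, dtype=np.float32)
kf.errorCovPost = np.eye(4, dtype=np.float32)

def track_sideline(frame_bgr):
    """Return a Kalman-smoothed (rho, theta) estimate of one runway sideline."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)

    # Stage 1: SURF keypoints (here only detected; in a full system they
    # would gate the region of interest around the runway)
    keypoints = surf.detect(gray, None)

    # Stage 2: Structured Forests edge probability map on a float RGB image,
    # thresholded into a binary mask for the Hough transform
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB).astype(np.float32) / 255.0
    edges = edge_model.detectEdges(rgb)
    mask = (edges > 0.3).astype(np.uint8) * 255

    # Candidate sidelines as (rho, theta) pairs from the standard Hough transform
    lines = cv2.HoughLines(mask, 1, np.pi / 180, threshold=100)

    # Stage 3: Kalman predict/correct; fall back to the prediction when the
    # edge detector yields no line (e.g. in fog)
    prediction = kf.predict()
    if lines is not None:
        rho, theta = lines[0][0]  # strongest candidate as the measurement
        state = kf.correct(np.array([[rho], [theta]], np.float32))
        return float(state[0, 0]), float(state[1, 0])
    return float(prediction[0, 0]), float(prediction[1, 0])

    In a sketch like this, the fall-back to the filter prediction when no line is detected is what gives the smoothing stage its value in the difficult-weather cases the paper evaluates.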

    Keywords: aerial systems: perception and autonomy, vision-based navigation, computer vision for automation, autonomous landing, autonomous vehicle navigation

    Received: 03 Sep 2024; Accepted: 11 Nov 2024.

    Copyright: © 2024 Tsapparellas, Jelev, Waters, Shrikhande, Brunswicker and Mihaylova. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

    * Correspondence: Aditya M Shrikhande, The University of Sheffield, Sheffield, United Kingdom

    Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.