
ORIGINAL RESEARCH article

Front. Virtual Real.
Sec. Technologies for VR
Volume 5 - 2024 | doi: 10.3389/frvir.2024.1230885
This article is part of the Research Topic: Immersive and XR Interfaces for Exploration, Monitoring, and Intervention through Teleoperation.

Visual Augmentation of Live-Streaming Images in Virtual Reality to Enhance Teleoperation of Unmanned Ground Vehicles

Provisionally accepted
Yiming Luo 1, Jialin Wang 1, Yushan Pan 1, Shan Luo 2, Pourang Irani 3, Hai-Ning Liang 1*
  • 1 Xi'an Jiaotong-Liverpool University, Suzhou, Jiangsu Province, China
  • 2 King's College London, London, England, United Kingdom
  • 3 University of British Columbia, Vancouver, British Columbia, Canada

The final, formatted version of the article will be published soon.

    First-person view (FPV) technology in virtual reality (VR) can offer in-situ environments in which teleoperators can manipulate unmanned ground vehicles (UGVs). However, both non-expert and expert robot teleoperators still have trouble controlling robots remotely in various situations. For example, obstacles are hard to avoid when teleoperating UGVs in dim, dangerous, and difficult-to-access areas, and unstable lighting can cause teleoperators to feel stressed. To help teleoperators operate UGVs efficiently, we adopted the construction-style yellow-and-black line pattern from everyday life as a standard design space and customised the Sobel algorithm to develop VR-mediated teleoperation that enhances teleoperators' performance. Our results show that our approach can improve user performance on avoidance tasks involving static and dynamic obstacles and reduce workload demands and simulator sickness. Our results also demonstrate that other adjustment combinations (e.g., removing the original image from the edge-enhanced view and using a blue filter with yellow edges) can reduce the effect of high exposure in a dark environment on operation accuracy. Our work can serve as a solid case for using VR to mediate and enhance teleoperation across a wider range of applications.
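    For readers unfamiliar with the kind of edge enhancement the abstract refers to, the sketch below illustrates a generic Sobel-based overlay in Python/OpenCV: edges are extracted from the live camera frame and painted in construction yellow, optionally with the original image suppressed for dark, high-exposure scenes. This is a minimal, assumed pipeline for illustration only; the function name, thresholds, colours, and the "blue filter" step are hypothetical stand-ins and do not reproduce the authors' customised implementation.

    ```python
    import cv2
    import numpy as np

    def edge_enhance(frame_bgr, edge_color=(0, 255, 255), keep_original=True):
        """Illustrative Sobel edge overlay (not the paper's exact method).

        edge_color defaults to construction yellow in BGR order.
        If keep_original is False, the camera image is replaced by a dark
        blue-tinted backdrop so only the yellow edges remain visible.
        """
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)

        # Sobel gradients in x and y, combined into an edge-magnitude map
        gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)
        gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)
        mag = cv2.magnitude(gx, gy)
        mag = cv2.normalize(mag, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)

        # Keep only strong edges (threshold value chosen arbitrarily here)
        _, mask = cv2.threshold(mag, 60, 255, cv2.THRESH_BINARY)

        if keep_original:
            base = frame_bgr.copy()
        else:
            # Hypothetical "edges only" mode with a blue-tinted background
            base = np.full_like(frame_bgr, (60, 20, 0))

        base[mask > 0] = edge_color  # paint edges in yellow
        return base
    ```

    In a VR teleoperation loop, such a function would be applied to each incoming video frame before it is mapped onto the in-headset display surface; the actual parameters used in the study are described in the full article.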

    Keywords: Vision augmentation, Edge enhancement, virtual reality, teleoperation, Unmanned ground vehicles

    Received: 29 May 2023; Accepted: 27 Aug 2024.

    Copyright: © 2024 Luo, Wang, Pan, Luo, Irani and Liang. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

    * Correspondence: Hai-Ning Liang, Xi'an Jiaotong-Liverpool University, Suzhou, 215123, Jiangsu Province, China

    Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.