AUTHOR=Paradise Andre, Surve Sushrut, Menezes Jovan C., Gupta Madhav, Bisht Vaibhav, Jang Kyung Rak, Liu Cong, Qiu Suming, Dong Junyi, Shin Jane, Ferrari Silvia
TITLE=RealTHASC—a cyber-physical XR testbed for AI-supported real-time human autonomous systems collaborations
JOURNAL=Frontiers in Virtual Reality
VOLUME=4
YEAR=2023
URL=https://www.frontiersin.org/journals/virtual-reality/articles/10.3389/frvir.2023.1210211
DOI=10.3389/frvir.2023.1210211
ISSN=2673-4192
ABSTRACT=Artificial intelligence (AI) research on human-robot teams requires the ability to test software and algorithms for perception, decision-making, and collaboration in complex real-world settings that elude laboratory experiments. On the other hand, field experiments, also called "in the wild," do not provide ground truth at the level of detail necessary for performance comparison and validation. Experiments on pre-recorded real-world data sets, such as video or sensor data passively collected in the wild, are also significantly limited in their usefulness because they do not allow testing the effectiveness of active perception and control or decision strategies in the loop. Furthermore, the cost of utilizing a large number of robots and/or humans in any given experiment is often prohibitive even for industry and may result in huge losses when environmental conditions or other factors disrupt testing. The novel Real-Time Human Autonomous Systems Collaborations (RealTHASC) facility at Cornell University interfaces real and virtual robots and humans with photo-realistic simulated environments by implementing new concepts for the seamless integration of wearable sensors, motion capture, physics-based simulation, robot hardware, and virtual reality (VR). The result is an extended reality (XR) testbed by which real robots and humans in the laboratory are able to experience virtual worlds, inclusive of virtual agents, through real-time visual feedback and interaction. Virtual reality 3D body tracking is employed in conjunction with a rigid-body motion capture system to transfer each real human/robot agent from the real world into a synthetic virtual environment, constructing a corresponding avatar that not only mimics the behavior of its real-world counterpart in real time but also experiences the virtual world and transmits sensed information back to the real human/robot agent. New cross-domain synthetic environments are created in RealTHASC to bridge the simulation-to-reality gap and allow for the inclusion of underwater/ground/aerial autonomous vehicles equipped with a multi-modal sensor suite. The experimental capabilities offered by RealTHASC are demonstrated through three case studies showcasing real/virtual interactions in diverse domains, leveraging and complementing the benefits of experimenting in simulation and in the real world.
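
The abstract's central mechanism is a real-time mirroring loop: a rigid-body motion-capture system tracks each real human/robot agent, the tracked pose drives a corresponding avatar in a photo-realistic virtual world, and sensor data rendered in that world is transmitted back to the real agent. The Python sketch below illustrates one plausible shape of such a loop; it is a minimal illustration under stated assumptions, and the Pose, MocapStream, and VirtualWorld names are hypothetical placeholders, not APIs from the paper or from any particular middleware.

    import time
    from dataclasses import dataclass

    @dataclass
    class Pose:
        """6-DOF rigid-body pose: position (x, y, z) and an orientation quaternion."""
        position: tuple
        orientation: tuple

    class MocapStream:
        """Hypothetical stand-in for a rigid-body motion-capture feed."""
        def latest_pose(self, agent_id: str) -> Pose:
            # A real system would return the most recently tracked pose;
            # here a fixed placeholder keeps the sketch self-contained.
            return Pose(position=(0.0, 0.0, 1.7), orientation=(0.0, 0.0, 0.0, 1.0))

    class VirtualWorld:
        """Hypothetical stand-in for the photo-realistic simulator hosting each avatar."""
        def update_avatar(self, agent_id: str, pose: Pose) -> None:
            print(f"[sim] avatar {agent_id} moved to {pose.position}")

        def render_sensors(self, agent_id: str) -> dict:
            # In a real testbed this would be camera frames, range data, etc.,
            # sensed from the avatar's viewpoint inside the virtual world.
            return {"rgb_frame": b"...", "detections": []}

    def mirror_loop(mocap: MocapStream, world: VirtualWorld,
                    agent_id: str, hz: float = 60.0) -> None:
        """Per tick: real pose -> avatar update -> virtual sensing -> feedback."""
        period = 1.0 / hz
        for _ in range(3):  # a few ticks for demonstration; a real loop runs until shutdown
            pose = mocap.latest_pose(agent_id)       # 1. capture the real agent's pose
            world.update_avatar(agent_id, pose)      # 2. drive the avatar in the synthetic environment
            sensed = world.render_sensors(agent_id)  # 3. sense the virtual world through the avatar
            # 4. transmit sensed data back to the real agent (VR headset, robot autonomy stack, ...)
            print(f"[feedback] {agent_id} received {list(sensed)} from the virtual world")
            time.sleep(period)

    if __name__ == "__main__":
        mirror_loop(MocapStream(), VirtualWorld(), agent_id="human_01")

The key design property the sketch highlights is bidirectionality: the same loop iteration that pushes the real agent's state into the simulation also pulls virtual sensor data back out, which is what lets real robots and humans experience (and act on) the virtual world in real time.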