- 1. State Key Laboratory of Robotics and Systems (HIT), Harbin, China
- 2. The Key Laboratory of Road Construction Technology and Equipment of MOE, Chang'an University, Xi'an, China
- 3. Bio-Inspired Robotics and Neural Engineering Laboratory, School of Information Science and Technology, Vidyasirimedhi Institute of Science and Technology, Rayong, Thailand
- 4. Embodied AI and Neurorobotics Laboratory, SDU Biorobotics, The Mærsk Mc-Kinney Møller Institute, University of Southern Denmark, Odense, Denmark
- 5. School of Mechanical Engineering, Xi'an Jiaotong University, Xi'an, China
Editorial on the Research Topic
Multimodal behavior from animals to bio-inspired robots
Introduction
Animals exhibit multimodal behavior; that is, they can move in different patterns, such as crawling, running, jumping, hunting, and escaping. Quadrupeds can maintain balance and coordination on challenging terrains, amphibians can transition between land and water, and insects can both fly and crawl freely. These innate locomotion abilities come naturally to animals and enable them to survive in nature by efficiently exploiting their body morphology and neural control systems. It is therefore important to investigate the mechanisms and influencing factors of multimodal behavior, especially as biological guidance for the development of dexterous, agile, and stable robots.
Various attempts, including the use of hybrid structural systems, neural control, and embodied intelligence, have been made to reproduce such impressive multimodal behaviors in robots, e.g., the turtle robot (Baines et al., 2022), salamander robot (Ijspeert et al., 2007), gecko robot (Shao et al., 2022), quadruped robot (Miki et al., 2022), millipede robot (Homchanthanakul and Manoonpong, 2023), and dung beetle robot (Xiong and Manoonpong, 2021). However, compared with their biological counterparts, robots still fall short in adaptability, flexibility, and versatility, because multimodal behavior requires the synthesis of structure, control, planning, and optimization. Although the multimodal concept, previously realized in vehicle and electrical engineering, promises to greatly improve the agility and intelligence of bio-inspired robots' behavioral control, it also poses new challenges. For example, an animal's muscle tissue and body morphology differ substantially from a robot's actuators, and its neural system is far more complex than a robot's control system. To bridge the gap between biology and bio-inspired robots, these factors must be taken into account (Manoonpong et al., 2021). Therefore, further research and discussion of the multimodal concept in robot behavior are still needed.
The goal of this Research Topic is to present recent progress on “Multimodal Behavior from Animals to Bio-Inspired Robots,” including the design of mechanical structures, sensory systems, and control strategies, as well as the interplay between biology and engineering. The topic contains four articles addressing control methods for the multimodal behavior of bionic robots with different structures, including biped and hexapod robots, as well as sensory systems and reinforcement learning methods for multimodal behavior.
Multimodal behavior
Researchers have developed many bionic robots whose configurations and locomotion are mostly inspired by animals, e.g., the typical gaits (walking and running) of quadruped robots (Lee et al., 2020), flying robots with complex wing dynamics (Ramezani et al., 2017), and swimming robots that undulate their bodies and/or flap their fins (Ijspeert, 2014). These behaviors not only bring a unique perspective to research on biology and bionic robotics but also pose substantial technological challenges for robot modeling, design, and control.
Structure and body morphology
For bionic robots, different structures and morphologies should be designed according to the application scenario, so the questions of interest and the unsolved problems vary accordingly. In this Research Topic, Zhao et al. explore the effects of serial and parallel elasticity on hopping with a two-segmented robotic leg called the electric-pneumatic actuation (EPA)-Hopper. The serial and parallel pneumatic artificial muscles (PAMs) are evaluated with respect to four criteria: efficiency, performance, stability, and robustness of hopping against perturbations. The serial PAM has a more pronounced impact on these criteria than the parallel PAM, a finding that can deepen our understanding of human hopping and support the design and control of legged robots and assistive devices. Strohmer et al. developed a spiking neural network capable of continuously changing the amplitude, frequency, and phase of its output online. The network is deployed on the Modular Robot Framework (MORF), available both as a simulated robot in V-REP and as a physical hexapod robot with 18 degrees of freedom, to examine the role of network topology in locomotion. This neural control of locomotion ties back to biology, where it can provide theoretical evidence that is not currently testable on behaving insects.
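Strohmer et al.'s controller is a spiking neural network; purely as a minimal, rate-based sketch of the underlying idea (online modulation of the amplitude, frequency, and phase of a rhythmic command without output discontinuities), a simple phase-amplitude oscillator can serve as an illustration. All function and parameter names below are assumptions for this sketch, not taken from the article.

```python
import numpy as np

def cpg_step(state, dt, freq_hz, target_amp, phase_offset, gain=2.0):
    """One integration step of a phase-amplitude oscillator.

    Frequency, amplitude, and phase offset can be changed online; the
    amplitude relaxes toward its target, so the command stays smooth.
    """
    phase, amp = state
    phase += 2.0 * np.pi * freq_hz * dt            # phase advances at the commanded frequency
    amp += gain * (target_amp - amp) * dt          # amplitude converges smoothly to its target
    command = amp * np.sin(phase + phase_offset)   # rhythmic output, e.g., a joint-angle offset
    return (phase, amp), command

# Example: change the gait parameters online halfway through a 4 s rollout.
state, dt = (0.0, 0.0), 0.01
trajectory = []
for k in range(400):
    freq, ampl = (1.0, 0.30) if k < 200 else (2.0, 0.15)   # slow/large -> fast/small oscillation
    state, cmd = cpg_step(state, dt, freq, ampl, phase_offset=0.0)
    trajectory.append(cmd)
```

Because the amplitude is low-pass filtered toward its target rather than switched instantaneously, parameter changes do not introduce jumps in the commanded joint trajectory.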
Behavioral control strategy
Highly dynamic control of bio-inspired robots is impressive and builds on an established body of techniques, such as nonlinear programming (NLP) (Jenelten et al., 2022), the linear quadratic regulator (LQR) (Klemm et al., 2020), and model predictive control (MPC) (Di Carlo et al., 2018). Besides model-based control, robots driven by neural control architectures, which can respond to internal and external information and implement various behaviors, offer superior higher-level capabilities in terms of diverse behavior patterns with smooth transitions (Zhu et al., 2022) as well as versatile locomotion on complex terrains (Picardi et al., 2020; Thor and Manoonpong, 2022). In this Research Topic, Bonzon defines a neural basis for such behaviors that could potentially be learned by bio-inspired artifacts. The stochastic decision tree that drives a behavior is transformed into a plastic neuronal circuit, and the principle of associating synchronized multimodal perceptions, in line with Hebb's principle of wiring together neuronal cells, is induced. The emergence of behaviors allows successive complexity levels to be strictly delineated, and isolating these levels makes it possible to simulate as-yet-unknown processes of cognition independently of their underlying neurological grounding. Koseki et al. provide a framework that reproduces multimodal bipedal locomotion exploiting passive dynamics through deep reinforcement learning (DRL). By carefully scheduling the weight parameters of the DRL reward function during learning according to a curriculum learning method, the bipedal model successfully learned multimodal behaviors (walking, running, and gait transitions). These results indicate that DRL can be applied to generate various gaits that make effective use of passive dynamics.
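Koseki et al.'s reward terms and curriculum schedule are specific to their bipedal model; the snippet below is only a generic sketch of curriculum-based reward weighting, in which the objective is gradually shifted across training stages. The term names, weights, and stage-advancement rule are hypothetical.

```python
# Illustrative sketch of curriculum-based reward weighting for DRL gait learning.
# The reward terms, weights, and stage boundaries are hypothetical, not Koseki et al.'s.

def reward_weights(stage):
    """Per-term reward weights for a given curriculum stage."""
    schedules = {
        0: {"alive": 1.0, "forward_velocity": 0.2, "energy": 0.0},  # stage 0: just stay upright
        1: {"alive": 0.5, "forward_velocity": 1.0, "energy": 0.1},  # stage 1: track the commanded speed
        2: {"alive": 0.2, "forward_velocity": 1.0, "energy": 0.5},  # stage 2: exploit passive dynamics
    }
    return schedules[min(stage, max(schedules))]

def total_reward(terms, stage):
    """Weighted sum of raw reward terms, e.g. terms = {"alive": 1.0, "forward_velocity": 0.8, "energy": -0.3}."""
    weights = reward_weights(stage)
    return sum(weights[name] * value for name, value in terms.items())

# During training, the curriculum stage is advanced when the policy reaches a
# performance threshold (e.g., mean episode return), gradually shifting the
# objective from "do not fall" toward fast, energy-efficient locomotion.
```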
Concluding remarks
This Research Topic presents approaches to the multimodal behavior of bio-inspired robots, considering mechanical design, sensory integration, control strategies, and applications in different environments. The studies cover multimodal behaviors for hopping, biped, and hexapod robots. Their results confirm that the sensory system supports decision-making for multimodal behaviors. Furthermore, neural control architectures can readily realize a variety of gait patterns, while reinforcement learning and control methods ensure the stability and reliability of the resulting locomotion. Building on these and further studies, robots with more sophisticated structures and systems are likely to exhibit more complex multimodal behavior and to come closer to the behavioral mechanisms and physiological functions of animals.
Author contributions
All authors listed have made a substantial, direct, and intellectual contribution to the work and approved it for publication.
Acknowledgments
We acknowledge support from the National Natural Science Foundation of China (Grant No. 51605039), the State Key Laboratory of Robotics and Systems (HIT) (Grant No. SKLRS-2023-KF-05), and the Fundamental Research Funds for the Central Universities (Grant No. 300102259308). PM acknowledges support from Vidyasirimedhi Institute of Science and Technology (VISTEC)-research funding on the BrainBot project (I22POM-INT010). The funders had no role in the study design, data collection, analysis, decision to publish, or preparation of the manuscript.
Conflict of interest
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Publisher's note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
References
Baines, R., Patiballa, S. K., Booth, J., Ramirez, L., Sipple, T., Garcia, A., et al. (2022). Multi-environment robotic transitions through adaptive morphogenesis. Nature 610, 283–289. doi: 10.1038/s41586-022-05188-w
Di Carlo, J., Wensing, P. M., Katz, B., Bledt, G., and Kim, S. (2018). “Dynamic locomotion in the MIT Cheetah 3 through convex model-predictive control,” in IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (Madrid), 1–9.
Homchanthanakul, J., and Manoonpong, P. (2023). Proactive body joint adaptation for energy-efficient locomotion of bio-inspired multi-segmented robots. IEEE Robot. Autom. Lett. 8, 904–911. doi: 10.1109/LRA.2023.3234773
Ijspeert, A. J. (2014). Biorobotics: using robots to emulate and investigate agile locomotion. Science 346, 196–203. doi: 10.1126/science.1254486
Ijspeert, A. J., Crespi, A., Ryczko, D., and Cabelguen, J. M. (2007). From swimming to walking with a salamander robot driven by a spinal cord model. Science 315, 1416–1420. doi: 10.1126/science.1138353
Jenelten, F., Grandia, R., Farshidian, F., and Hutter, M. (2022). TAMOLS: Terrain-aware motion optimization for legged systems. IEEE Trans. Robot. 38, 3395–3413. doi: 10.1109/TRO.2022.3186804
Klemm, V., Morra, A., Gulich, L., Mannhart, D., Rohr, D., Kamel, M., et al. (2020). LQR-assisted whole-body control of a wheeled bipedal robot with kinematic loops. IEEE Robot. Autom. Lett. 5, 3745–3752. doi: 10.1109/LRA.2020.2979625
Lee, J., Hwangbo, J., Wellhausen, L., Koltun, V., and Hutter, M. (2020). Learning quadrupedal locomotion over challenging terrain. Sci. Robot. 5, eabc5986. doi: 10.1126/scirobotics.abc5986
Manoonpong, P., Patanè, L., Xiong, X., Brodoline, I., Dupeyroux, J., Viollet, S., et al. (2021). Insect-inspired robots: bridging biological and artificial systems. Sensors 21, 7609. doi: 10.3390/s21227609
Miki, T., Lee, J., Hwangbo, J., Wellhausen, L., Koltun, V., and Hutter, M. (2022). Learning robust perceptive locomotion for quadrupedal robots in the wild. Sci. Robot. 7, eabk2822. doi: 10.1126/scirobotics.abk2822
Picardi, G., Chellapurath, M., Iacoponi, S., Stefanni, S., Laschi, C., and Calisti, M. (2020). Bioinspired underwater legged robot for seabed exploration with low environmental disturbance. Sci. Robot. 5, eaaz1012. doi: 10.1126/scirobotics.aaz1012
Ramezani, A., Chung, S. J., and Hutchinson, S. (2017). A biomimetic robotic platform to study flight specializations of bats. Sci. Robot. 2, eaal2505. doi: 10.1126/scirobotics.aal2505
Shao, D., Wang, Z., Ji, A., Dai, Z., and Manoonpong, P. (2022). A gecko-inspired robot with CPG-based neural control for locomotion and body height adaptation. Bioinspir. Biomim. 17, 036008. doi: 10.1088/1748-3190/ac5a3c
Thor, M., and Manoonpong, P. (2022). Versatile modular neural locomotion control with fast learning. Nat. Mach. Intell. 4, 169–179. doi: 10.1038/s42256-022-00444-0
Xiong, X., and Manoonpong, P. (2021). No need for landmarks: an embodied neural controller for robust insect-like navigation behaviors. IEEE Trans. Cybern. 52, 12893–12904. doi: 10.1109/TCYB.2021.3091127
Keywords: multimodal behavior, bio-inspired robot, locomotion control, embodied intelligence, multimodal sensory integration
Citation: Zhu Y, Manoonpong P and Hu Q (2023) Editorial: Multimodal behavior from animals to bio-inspired robots. Front. Neurorobot. 17:1160549. doi: 10.3389/fnbot.2023.1160549
Received: 07 February 2023; Accepted: 10 February 2023;
Published: 21 February 2023.
Edited and reviewed by: Alois C. Knoll, Technical University of Munich, Germany
Copyright © 2023 Zhu, Manoonpong and Hu. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Yaguang Zhu, zhuyaguang@chd.edu.cn