ORIGINAL RESEARCH article
Front. Robot. AI
Sec. Human-Robot Interaction
Volume 11 - 2024
doi: 10.3389/frobt.2024.1461615
This article is part of the Research Topic: AI-Powered Musical and Entertainment Robotics
Music, Body, and Machine: Gesture-Based Synchronization in Human-Robot Musical Interaction
Provisionally accepted
Center for Music Technology, College of Design, Georgia Institute of Technology, Atlanta, Georgia, United States
Musical performance relies on nonverbal cues to convey information among musicians. Human musicians use bodily gestures to communicate their interpretation and intentions to their collaborators, from mood and expression to anticipatory cues about structure and tempo. Robotic musicians can use their physical bodies in a similar way when interacting with fellow musicians. This paper presents a new theoretical framework for classifying musical gestures and a study evaluating the effect of robotic gestures on synchronization between human musicians and Shimon, a robotic marimba player developed at Georgia Tech. Shimon uses head and arm movements to signal musical information such as expected notes, tempo, and beat. In the study, piano players were asked to play along with Shimon and were evaluated on their ability to synchronize with unknown tempo changes communicated through the robot's ancillary and social gestures. The results demonstrate the significant contribution of non-instrumental gestures to human-robot synchronization, highlighting the importance of non-music-making gestures for anticipation and coordination in human-robot musical collaboration. Subjects also reported more positive feelings when interacting with the robot's ancillary and social gestures, underscoring the role of these gestures in supporting engaging and enjoyable musical experiences.
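The abstract describes evaluating how well pianists synchronized with unannounced tempo changes, but does not specify a metric. As a rough, hypothetical sketch of how such synchronization might be quantified, the Python snippet below builds a reference beat grid containing a tempo change and computes the mean absolute asynchrony between (simulated) note onsets and that grid. The function names, the 90-to-120 bpm change, and the 20 ms jitter value are illustrative assumptions, not details taken from the study.

import numpy as np

def beat_times(start, tempos, beats_per_segment):
    """Build a reference beat grid from tempo segments, e.g. a piece
    that changes tempo partway through (hypothetical example)."""
    times, t = [], start
    for bpm, n in zip(tempos, beats_per_segment):
        period = 60.0 / bpm  # seconds per beat at this tempo
        for _ in range(n):
            times.append(t)
            t += period
    return np.array(times)

def mean_asynchrony(onsets, reference):
    """Match each played onset to its nearest reference beat and
    return the mean absolute asynchrony in milliseconds."""
    onsets = np.asarray(onsets)
    idx = np.abs(onsets[:, None] - reference[None, :]).argmin(axis=1)
    return 1000.0 * np.mean(np.abs(onsets - reference[idx]))

# Reference grid: 8 beats at 90 bpm, then an unannounced shift to 120 bpm
ref = beat_times(0.0, tempos=[90, 120], beats_per_segment=[8, 8])
# Simulated pianist onsets with ~20 ms of timing jitter (assumed value)
rng = np.random.default_rng(0)
played = ref + rng.normal(0.0, 0.02, size=ref.shape)
print(f"mean absolute asynchrony: {mean_asynchrony(played, ref):.1f} ms")

Lower asynchrony immediately after the tempo shift would, under this toy metric, indicate that the robot's gestures helped the player anticipate the change.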
Keywords: human-robot interaction, synchronization, robotic gestures, robotic musicianship, robots, music
Received: 12 Sep 2024; Accepted: 07 Nov 2024.
Copyright: © 2024 Gao, Rogel, Sankaranarayanan, Dowling and Weinberg. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
* Correspondence:
Xuedan Gao, Center for Music Technology, College of Design, Georgia Institute of Technology, Atlanta, Georgia 30332, United States
Amit Rogel, Center for Music Technology, College of Design, Georgia Institute of Technology, Atlanta, Georgia 30332, United States
Gil Weinberg, Center for Music Technology, College of Design, Georgia Institute of Technology, Atlanta, Georgia 30332, United States
Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.