AUTHOR=Kolkhorst Henrich, Veit Joseline, Burgard Wolfram, Tangermann Michael
TITLE=A Robust Screen-Free Brain-Computer Interface for Robotic Object Selection
JOURNAL=Frontiers in Robotics and AI
VOLUME=7
YEAR=2020
URL=https://www.frontiersin.org/journals/robotics-and-ai/articles/10.3389/frobt.2020.00038
DOI=10.3389/frobt.2020.00038
ISSN=2296-9144
ABSTRACT=

Brain signals represent a communication modality that can allow users of assistive robots to specify high-level goals, such as the object to fetch and deliver. In this paper, we consider a screen-free Brain-Computer Interface (BCI), where the robot highlights candidate objects in the environment using a laser pointer, and the user goal is decoded from the evoked responses in the electroencephalogram (EEG). Having the robot present stimuli in the environment allows for more direct commands than traditional BCIs that require the use of graphical user interfaces. Yet bypassing a screen entails less control over stimulus appearances. In realistic environments, this leads to heterogeneous brain responses for dissimilar objects—posing a challenge for reliable EEG classification. We model object instances as subclasses to train specialized classifiers in the Riemannian tangent space, each of which is regularized by incorporating data from other objects. In multiple experiments with a total of 19 healthy participants, we show that our approach not only increases classification performance but is also robust to both heterogeneous and homogeneous objects. While especially useful in the case of a screen-free BCI, our approach can naturally be applied to other experimental paradigms with potential subclass structure.
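The core idea in the abstract — per-object ("subclass") classifiers operating on tangent-space features of EEG covariance matrices, each regularized by borrowing down-weighted data from other objects — can be sketched as follows. This is an illustrative assumption-laden toy, not the authors' implementation: the synthetic "epochs", the arithmetic-mean reference point, the plain gradient-descent logistic regression, and the borrowing weight `alpha` are all my choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_spd(dim, shift, scale):
    """Synthetic SPD 'covariance': Wishart-like noise plus a class-dependent shift.
    Stands in for an EEG epoch covariance; purely illustrative."""
    A = rng.normal(size=(dim, dim))
    return scale * (A @ A.T) + shift * np.eye(dim)

def tangent_features(covs, ref):
    """Whiten each covariance by a reference point, map to the tangent space
    via the matrix logarithm, and vectorize the upper triangle."""
    vals, vecs = np.linalg.eigh(ref)
    inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    iu = np.triu_indices(ref.shape[0])
    feats = []
    for C in covs:
        W = inv_sqrt @ C @ inv_sqrt
        wv, wu = np.linalg.eigh(W)
        L = wu @ np.diag(np.log(wv)) @ wu.T
        feats.append(L[iu])
    return np.array(feats)

def fit_weighted_logreg(X, y, sample_w, l2=0.01, lr=0.5, steps=800):
    """L2-regularized logistic regression by gradient descent. Sample weights
    let a subclass-specific model borrow (down-weighted) data from other objects."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        z = np.clip(Xb @ w, -30.0, 30.0)
        p = 1.0 / (1.0 + np.exp(-z))
        grad = Xb.T @ (sample_w * (p - y)) / sample_w.sum()
        grad[:-1] += l2 * w[:-1]
        w -= lr * grad
    return w

def predict(X, w):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return (Xb @ w > 0).astype(int)

# Two object subclasses with dissimilar covariance scales (heterogeneous
# stimuli); target responses get an additive shift over non-targets.
dim, n_per = 4, 60
covs, y, subclass = [], [], []
for s, scale in enumerate([1.0, 2.5]):            # two objects
    for label, shift in [(0, 0.5), (1, 5.0)]:     # non-target vs. target
        for _ in range(n_per):
            covs.append(random_spd(dim, shift, scale))
            y.append(label)
            subclass.append(s)
y, subclass = np.array(y), np.array(subclass)

ref = np.mean(covs, axis=0)   # arithmetic mean as a cheap reference point
X = tangent_features(covs, ref)

# One classifier per object subclass, regularized by incorporating
# down-weighted data from the other object (alpha < 1 = borrowing strength).
alpha = 0.3
preds = np.empty_like(y)
for s in (0, 1):
    sw = np.where(subclass == s, 1.0, alpha)
    w = fit_weighted_logreg(X, y, sw)
    preds[subclass == s] = predict(X[subclass == s], w)

acc = (preds == y).mean()
```

In this toy setup the per-subclass models can adapt their bias and weights to each object's covariance scale while the shared, down-weighted data stabilizes the fit — mirroring, in miniature, why the subclass treatment helps with heterogeneous objects.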