The availability of commodity 3D tracking hardware has led to renewed interest in freehand gesture-based interfaces. This is hardly surprising: research has long shown that the hands are our most natural and expressive input modality, with remarkably high information throughput. Nevertheless, the speed and dexterity of the human hand pose a strong challenge for any tracking approach, and designers are inevitably forced to compromise between the most natural gestures and the most reliably recognizable ones. Many investigations have addressed this challenge by developing strategies for elaborate elicitation studies, providing custom or user-specific gesture sets, conducting large-scale studies “in the wild”, or devising ingenious approaches to reveal users’ available built-in gestures. Unfortunately, the conceptual basis has barely evolved in the last 20 years, and most approaches still rest on traditional paradigms such as predefined gestures and postures mapped to commands. We believe that a better understanding of human hand-object interaction and of object affordance recognition would help generate more robust and natural user interfaces, opening up the design space for future interactive systems based on natural hand interactions, i.e., without requiring specific gestures or skills.
The goal of this article collection is to bring together researchers from academia and industry across the various areas concerned with freehand interaction: virtual reality, pervasive computing, psychology, human-robot interaction, and related fields. The domain is currently rather fragmented, with a tremendous amount of work published each year yet scattered across different venues and broadly focused journals. We therefore welcome research on topics centered on freehand interaction, such as grasp and touch in virtual/augmented reality, that crosses the borders of different domains and may foster discussion and future collaboration. The aim is to advance the conceptual groundwork in the area and enable the transfer of results into practice.
We seek original, unpublished papers documenting research contributions, practice and experience, or systems and novel applications. Critical reviews, surveys, and eye-opening essays are also welcome. Papers may be of variable length, which should match the level of detail of the research contribution described. Reviews, essays, and conceptual contributions will be assessed on their originality and their ability to provoke discussion and collaboration across domains. User interface evaluations and meta-evaluations are strongly encouraged for any scientific contribution. Additional information and formatting instructions can be found here. Specific topics include, but are not limited to:
- Freehand interaction techniques and paradigms in virtual and augmented reality
- Touch and multi-touch technologies for interaction with virtual objects
- Hand and finger tracking techniques and devices
- Tangible interaction using active, passive, or dynamic-passive proxies
- Hand redirection techniques
- Adaptive and perceptually inspired spatial interaction interfaces
- Gesture- and micro gesture-based spatial interaction paradigms
- Kinematics-based hand grip and lift in virtual environments