
METHODS article

Front. Virtual Real., 08 June 2022
Sec. Technologies for VR
This article is part of the Research Topic Electrical Stimulation for Immersive Virtual and Augmented Reality

Holitouch: Conveying Holistic Touch Illusions by Combining Pseudo-Haptics With Tactile and Proprioceptive Feedback During Virtual Interaction With 3DUIs

  • 1HCI Group, Universität Hamburg, Hamburg, Germany
  • 2German Research Center for Artificial Intelligence (DFKI), Saarland University, Saarbrücken, Germany

Virtual reality technology and immersive virtual environments often support realistic hand representations via hand-posture-sensing controllers or hand tracking for natural hand-based interaction. However, one limiting factor remains: the lack of realistic haptic feedback, including tactile and proprioceptive cues, even for simple haptic interactions like touching a virtual object. This paper introduces the Holitouch technique to improve the haptic realism of essential 3D user interface elements such as buttons. Holitouch is a feedback technique based on a wearable device that combines different types of haptic feedback (i.e., 1) pseudo-haptic, 2) tactile, and 3) proprioceptive) to convey a holistic sensation of stiffness, contact, and activation while interacting with 3D buttons. Our approach provides these sensations by utilizing redundant multisensory cues, i.e., congruent feedback, to create plausible illusions of touch. The results of two experiments show that the proposed feedback combination contributes to delivering a holistic sensation when interacting with buttons in VR while achieving high user acceptance.

1 Introduction

Recent advancements in the quality and realism of visual as well as auditory feedback in immersive virtual environments (IVEs) have been enormous, and audio-visual IVE experiences have become quite realistic. Conversely, providing realistic haptic stimulation for virtual reality (VR) still requires the implementation of complex haptic displays (Sutherland, 1965) to deliver meaningful and realistic tactile as well as proprioceptive feedback. However, recent developments in the area of haptic displays, especially hand wearables, have shown a significant shift in the last years1.

The relevance and importance of the hands for interaction are depicted in the cortical homunculus (Reed and Ziat, 2018), which shows that a large proportion of the human brain is dedicated to processing their motor and sensory functions. Consequently, the human hands evolved into a universal tool for interaction. Modern VR systems typically support user hand representations via hand-posture-sensing controllers or hand tracking, displaying virtual hands that match the size, position, orientation, and pose of their real counterparts, increasing the sense of presence and embodiment (Yuan and Steed, 2010; Argelaguet et al., 2016). This also applies in the context of interaction, as virtual hands are essential for natural 3D user interfaces (3DUIs) with realistic virtual objects (Verschoor et al., 2020). The combination of virtual hands and haptic displays closes the gap between input and feedback, as users can reach out and manipulate virtual objects and then perceive tactile and proprioceptive feedback.

While most of the research on haptics has produced promising results, it currently faces challenges related to the complex, expensive, and grounded equipment that is often required. This situation reveals a gap between such research and consumer VR, with its growing requirements for convincing haptic feedback when interacting with basic 3DUI widgets such as buttons or menus. Hence, novel means are needed to compensate for the missing tactile and kinesthetic cues when users interact with virtual and “intangible” 3DUI elements. For these reasons, there is a need for wearable (and consumer-friendly) feedback solutions that can simulate sensations like contact and stiffness and provide on-skin sensations without the limitations of grounded technologies (i.e., a limited spatial workspace) while still enabling realistic touch interactions.

In this work, we propose such a solution by presenting a non-intrusive feedback approach based on a wearable device that can be implemented in a consumer-friendly form factor and leverages multimodal cues. Our approach combines pseudo-haptic, tactile, and proprioceptive feedback to convey sensations of stiffness, surface contact, and activation, respectively, while interacting with realistic 3DUIs in VR. While previous work has provided compelling touch illusions in IVEs, no solution yet conveys holistic sensations for 3DUI elements. We conducted two experiments in the context of interacting with 3D buttons to evaluate the proposed Holitouch technique. 3D buttons are simple widgets for holistic haptic sensations, as they are basic 3DUI elements with clear affordances. Typically, users are aware of various types of buttons (e.g., with different internal mechanics) due to lifelong experience in operating buttons in the real world, such as keyboard keys or light switches. Moreover, in principle, the considered haptic feedback provided by 3D buttons can potentially be generalized to other virtual 3D objects in VR. In particular, haptic properties such as stiffness, softness, hardness, and contact are also important for VR-based training, simulation, or entertainment.

With our proposed Holitouch technique, a user can reach towards a virtual button in VR. When pressing the button, she receives a tapping sensation as the virtual finger collides with the surface of the virtual button, while also feeling a resistance induced by electrical stimulation of the finger tendons. As the user continues the interaction and presses the button further down, our approach employs a pseudo-haptic technique informed by psychophysical user experiments. Specifically, it applies a gain factor to the virtual hand movement in order to realistically convey the stiffness properties of the button mechanics. Moreover, tactile feedback on the index fingertip complements the contact sensation with cues signaling the activation of the button (i.e., its internal click mechanism).

To summarize, this paper makes the following contributions:

• Introduction of a novel visuo-haptic technique, called Holitouch, which simulates the stiffness of 3DUI elements in VR. Our approach combines pseudo-haptic manipulation of the C/D ratio, tactile tapping sensations, proprioceptive feedback based on electrical tendon stimulation, and vibrotactile feedback to communicate the sensations related to the displacement and activation of a 3D button.

• Report of the results from two perceptual user experiments, which we conducted to inform the parameterization of the combination of pseudo-haptics, tactile, and proprioceptive feedback for 3DUIs.

• Discussion of guidelines for researchers and practitioners on how to convey feedback about stiffness, contact, and activation in 3DUIs.

• Provision of all required resources to replicate the proposed device in an open software and hardware repository.

This work is motivated by the methodology of Specificity introduced by Bowman and Frohlich (2005). It states that instead of using generalized 3D interaction techniques, more specific approaches should be designed, implemented, and evaluated to take advantage of novel 3D technologies and prior knowledge on how to map between I/O devices and interaction in IVEs. In particular, we developed our Holitouch technique with a specific focus on the task- and device-specific aspects, i.e., using our haptic device for 3D button presses, in order to improve sense of presence, sense of agency, and user performance (Martinez Cornelio et al., 2017), also taking advantage of how our hands have evolved for manipulating objects (Napier and Tuttle, 1993).

The remainder of this paper is structured as follows. In Section 2, we present a review of related work and how we define our approach on top of the presented contributions. Section 3 describes the apparatus for the Holitouch technique, the general setup for the experiments, and how we built its hardware and software components. Then, Section 4 presents our methods for two experiments on 1) using pseudo-haptics and psychophysics to convey a realistic stiffness illusion, including a model to quantify the perceived stiffness, and 2) taking the resulting stiffness illusion and aggregating contact and activation sensations through the combination of pseudo-haptic, tactile, and proprioceptive feedback. Section 5 presents the main results of the experiments. The paper ends with a general discussion and a review of the proposed technique’s limitations (cf. Section 6).

2 Related Work

Our research builds on several related work areas, particularly visuo-haptic illusions, electrical muscle/tendon stimulation, multisensory feedback, haptic wearables, and button design, which will be reviewed in the following subsections.

2.1 Visuo-Haptic Illusions

As the visual sense often provides the most reliable cue, it tends to dominate other senses in cases of conflicting multisensory input. In this context, discrepancies between visual and tactile/proprioceptive feedback may go unnoticed, as the multisensory integration process relies mostly on visual perception (Rock and Victor, 1964). This characteristic of the human perceptual system facilitates three main categories of visuo-haptic illusions for VR: retargeting, redirection, and scaling virtual movements with a control-to-display (C/D) ratio.

Haptic retargeting uses physical objects as haptic proxies for multiple virtual objects. However, the properties of the physical objects, such as size and shape, must be identical [or at least similar (Nilsson et al., 2021)] to their virtual counterparts to achieve compelling haptic feedback. Azmandian et al. studied various ways to manipulate the representation of the virtual hand, to warp the virtual world, or to combine both approaches in order to bridge dislocations between a physical prop and its virtual counterparts (Azmandian et al., 2016; Cheng et al., 2017). Further work explored ungrounded haptic retargeting with handheld devices to simulate different tools (Yang et al., 2018).

In a similar way, hand redirection has been exploited to create illusions of different shapes during the exploration of the surfaces of virtual objects. Kohli achieved this by dynamically manipulating both the position of the virtual hand and the virtual representation of a physical object providing passive haptics (Kohli, 2013). The concept was initially proposed for redirected walking (Steinicke et al., 2010), creating the illusion of walking along a real path that is visually incongruent with the virtual one. Lower-bound detection thresholds for unnoticed hand redirection have been studied (Zenner and Krüger, 2019), contributing thresholds for vertical, horizontal, and gain-based hand offsets for conservative and realistic IVEs. Redirection has also been applied as finger translation gains to manipulate the perceived size of real objects during a grasping gesture (Bergström et al., 2019).

The third visuo-haptic illusion, which has recently received much research attention in the HCI and VR communities, is based on the control-to-display (C/D) ratio: the proportion between the displacement of the real and the virtual hand. Lécuyer et al. studied how increasing or decreasing the ratio can be used to provide pseudo-haptic feedback (Lecuyer et al., 2000; Lecuyer and Burkhardt, 2005), modulating the perceived physical properties of a virtual object (e.g., sliding a finger over materials with a C/D ratio greater than one creates the illusion of higher friction). This approach is often used as a mechanism for hand redirection (Abtahi and Follmer, 2018), enabling passive shape displays to manipulate the perceived resolution. Additionally, research by Achibet et al. explored the combination of pseudo and passive haptic feedback with a Virtual Mitten to create an illusion of grip and elastic forces with spring devices (Achibet et al., 2014; Achibet et al., 2017), and the combination of pseudo-haptics and redirected touching with an Elastic Arm to provide human-scale passive feedback as progressive resistance force when extending the arm (Achibet et al., 2015).

Regarding the use of C/D ratio manipulation to change the perception of physical properties in IVEs, recent work induced genuine weight and mass-distribution perception without kinesthetic feedback while preserving the sense of ownership of the virtual hand and producing high levels of presence, immersion, and enjoyment (Rietzler et al., 2018; Samad et al., 2019; Yu and Bowman, 2020). In the case of stiffness or compliance perception, research has explored haptic displays based on multisensory feedback to compensate for the limitations of traditional haptic devices in bandwidth and exerted force. Visual cues can be used to modify or deform the visual dimensions of an object to simulate stiffness, taking advantage of the visuo-haptic cross-modal transfers experienced by IVE users (Lecuyer et al., 2000; Biocca et al., 2001; Lécuyer, 2009). Our approach builds upon the concept of pseudo-haptics by C/D ratio manipulation to induce the perception of stiffness while pressing a virtual button.

2.2 Electrical Stimulation

Electrical muscle stimulation (EMS), or electromyostimulation, is the elicitation of muscle contractions using electric impulses. Tamaki et al. (2011) introduced EMS for controlling the user’s hand (and its 16 joints) by applying electrical stimuli to the muscles around the forearm, generating controlled extension, flexion, adduction, and abduction movements. This original work evolved into a multi-channel wearable device supporting fine-grained hand movements (Duente et al., 2017). In later work, selected muscles were stimulated to simulate a limb being hit, pushing or pulling the limb to create a compelling sensation of impact and inducing a proprioceptive reaction as muscle-propelled force feedback (Lopes and Baudisch, 2013; Lopes et al., 2015). EMS applied to multiple arm muscles can be used to add haptic feedback in IVEs that prevents virtual hands from passing through walls and heavy objects, creating a counterforce that pulls the user’s arm backward (Lopes et al., 2017). A similar setup has been used in cutscenes to enhance storytelling with higher perceived presence and realism (i.e., hand-shake sensations) (Khamis et al., 2019).

In mid-air interaction with 3D objects, like selection and manipulation, the illusion of touch has traditionally been provided by visual or auditory feedback; more recently, vibrotactile feedback has been used as well. When vibrations are replaced by or combined with EMS to supply cues regarding contact and material texture, users report better experiences, as EMS offers a wider range of strengths and qualities of haptic feedback (Pfeiffer et al., 2014).

Alternatively, force sensations can also be generated by tendon electrical stimulation (TES) on the skin surface, stimulating sensory nerves instead of motor nerves (as EMS does). TES works around any joint where muscles do not overlap the tendon. Moreover, TES can present a force sensation of around 250 gf and deliver proprioceptive feedback without generating muscle contractions, which is convenient for IVEs with limited interaction space.

Holitouch uses TES when the user taps a 3D button in mid-air: the sensation of contact is conveyed by combining tactile feedback and TES. Previous work combined TES with electrotactile feedback, using psychophysical tools to handle different inter-stimulus delays so that the two stimulations are perceived as simultaneous (Miyamoto et al., 2015); however, that approach was not combined with pseudo-haptics in any way. In our case, we want to elicit the sensation of a resistance force provided by the button, opposite to the finger displacement direction. To accomplish this, related studies (Takahashi et al., 2018) showed that the combination of TES, visual, and vibration stimulation enables the adequate perception of force using multimodal feedback, and also found that the direction of the force sensation is opposite to the motion elicited by electrical stimulation. Also, TES is suitable for momentary sensations (e.g., the short sensation due to the finger-button collision) when properly combined with other modalities (Takahashi et al., 2019). Additionally, electrical stimulation can be actuated on the back of the hand, using specific electrode layouts to increase dexterity and interactivity (Takahashi et al., 2021).

2.3 Multimodal Feedback

Multimodal feedback in IVEs must be provided adequately and in a timely manner to avoid the uncanny valley of haptics (Berger et al., 2018): spatio-temporal issues can decrease the quality of the subjective experience if the IVE fails to keep up with the increased expectations raised by complex haptic cues. Multimodal feedback can also affect the sense of presence in IVEs, increasing the attributed credibility, especially when vibrotactile feedback is combined with passive haptics (Goncalves et al., 2019). Studies on multimodal feedback have highlighted how adding modalities to the traditional visual feedback can reduce sensory conflicts, improve the sense of presence in IVEs, and improve user performance. Incorporating mechanoreceptive feedback is essential in tasks involving virtual hands manipulating virtual objects, as it facilitates the perception of contact when collisions with virtual objects are aligned with the most salient sensory feedback (Lee et al., 2017). Additional haptic feedback also reduces the detection of visuo-proprioceptive illusions because it inflates the detection threshold (Lee et al., 2015). Prior research investigated the effect of visual, auditory, and proprioceptive (i.e., force-displacement model) feedback, but not tactile cues, while pressing a virtual button, and found that unexpected effects related to underestimation may emerge when feedback modalities are combined (Faeth and Harding, 2014).

2.4 Haptic Wearables

The most appropriate input device for an interactive task depends on the perceptual structure of the task and the control properties of the device (Jacob et al., 1994). Thus, wearable finger-based haptics with tactile and proprioceptive feedback allows us to replicate real-world tasks relying on hand dexterity in IVEs. Following this guideline, we present examples of related technologies that support interactions with virtual hands in IVEs using wearables with active haptic feedback responding to the user’s motions, either related or similar to the interaction with a 3D button. User interactions must have clear states for the initiation, performance, and completion of tasks, and the feedback must eliminate ambiguity between states so that users can perform successful interactions more efficiently.

Choi et al. evaluated CLAW, a handheld haptic controller for grasping, touching, and triggering in VR that can render variable stiffness for rigid and soft materials (Choi et al., 2018). The Haptic Links controllers developed by Strasnick et al. can provide variable stiffness actuation to enable general-purpose bi-manual feedback in VR (Strasnick et al., 2018). The perception of softness/hardness and stickiness while touching virtual objects can be provided by wearables with electrotactile feedback (Yem et al., 2018). Kim et al. created HapThimble, a wearable device enabling interaction with virtual touch screens by delivering tactile, pseudo-force, and vibrotactile feedback (Kim et al., 2016). Handheld controllers like the Haptic Revolver and the RollingStone can provide touch sensations when a user touches a virtual surface, simulating different sliding and friction effects with a wheel mechanism (Lo et al., 2018; Whitmire et al., 2018). Schorr et al. investigated how tactile devices for the fingertip can support virtual object manipulation and exploration such as grasping, squeezing, pressing, lifting, and stroking (Schorr and Okamura, 2017). Sinclair et al. developed and evaluated CapstanCrunch, a palm-grounded haptic controller that renders force-resisting feedback for touching and grasping in VR (Sinclair et al., 2019). A self-contained worn system with retractable wires, called Wireality, can lock the fingers to provide convincing tangible interactions for contact with complex geometries (Fang et al., 2020). Previous research studied stiffness discrimination with wearable vibrotactile devices (Maereg et al., 2017). A handheld device (PaCaPa) can provide proprioceptive feedback to render size, shape, and stiffness in VR (Sun et al., 2019). Tasbi and VibeRo feature squeeze and vibrotactile feedback combined with pseudo-haptics to provide sensations of contact and stiffness in VR (Pezent et al., 2019, 2020; Adilkhanov et al., 2020). Finally, ultrasound, as a mid-air free-hand technology, is able to simulate varying stiffness sensations in VR (Marchal et al., 2020).

2.5 3D Button Design

In traditional 3DUIs, a button as an input element can be activated by a collision between simple colliders assigned to the button and the user’s hand, typically providing vibrotactile cues delivered, for instance, via a controller. However, realistic simulation is much more complex, as haptic feedback rendering relies on force-displacement curves and velocity profiles (Kim and Lee, 2013; Liao et al., 2018). In this way, it is possible to provide timely sensations for the button activation (Kim et al., 2018), the press-release movements, as well as vibration and velocity-dependent characteristics (Liao et al., 2020). The same approach has been used in IVEs, simulating buttons providing pseudo-forces and tactile feedback (Kim et al., 2016).

In mid-air interactions, it is possible to represent realistic buttons by always projecting the virtual fingertip onto the button surface to avoid incongruent visual interpenetrations, following the “God Object” constraint (Zilles and Salisbury, 1995; Harwin and Melder, 2002). Regarding the neuromechanics of a button, it is possible to predict the user’s performance based on motor-control models and mechanical design parameters, minimizing the error between expected and perceived button activations (Oulasvirta et al., 2018). Moreover, traditional input methods with 3D buttons (i.e., keyboards) in IVEs, with matching VR representations providing passive haptic feedback, have proven to deliver intuitive interactions and improve user performance (Schneider et al., 2019). Also for mid-air VR interaction, virtual buttons can be represented with multimodal cues (i.e., audio, visual, and haptic) to evaluate the sensation of press depth and keystroke performance (Bermejo et al., 2021). The integration of pseudo-haptic features has also been studied to evaluate embodiment, sense of reality, spatio-temporal perception, satisfaction, and hedonic quality, as well as protrusion and hit effects based on proximity feedback (Kim and Xiong, 2021).

In summary, there has been much recent work focusing on providing compelling touch illusions in IVEs. Similar approaches have studied prototypes for holistic haptic sensations based on both kinesthetic and tactile displays, integrating vibrotactile, thermal, and wrist/finger kinesthetic stimuli (Kammermeier et al., 2004). However, that approach focused on object identification for telepresence using grounded actuators. Also relevant, the combination of EMS and pseudo-haptics has been explored to create the proprioceptive illusion of weight in VR (Kim et al., 2022). Nonetheless, we are not aware of previous research introducing non-intrusive, all-in-one wearable haptic technology and multimodal cues that combine a pseudo-haptics predictive model, proprioceptive feedback, and tactile actuation for pressure and vibration to convey a holistic, synchronized sensation of stiffness, contact, and activation when interacting with 3DUI elements.

3 Materials

Our approach simultaneously provides pseudo-haptic, tactile, and proprioceptive feedback to convey a plausible illusion of touch when interacting with virtual 3D buttons. The interaction is based on three sensations:

1. Contact. Mid-air direct interaction happens during the arm’s ballistic reach towards the button, until the index finger touches the virtual button. Holitouch conveys the sensation of contact with tactile and proprioceptive feedback, providing both a fingertip tap and electrical stimulation of the finger tendons.

2. Stiffness. Whenever the user is pressing/releasing the virtual button, the Holitouch system provides the sensation of pseudo-haptic stiffness by visually modulating the arm’s movement towards the button, redirecting the virtual hand to simulate stiffer and softer buttons.

3. Activation. While the user displaces the finger until it goes beyond the virtual button’s operation threshold (i.e., click), Holitouch provides vibrotactile cues at the fingertip. These cues are generated by frequency modulation that mimics the force-displacement curve of a real button.

In order to evaluate how these sensations contribute to conveying a holistic touch illusion, we developed a wearable device (for contact and activation sensations) and a pseudo-haptic technique (for the stiffness sensation).
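To make the interplay of the three sensations concrete, the following Python sketch outlines a feedback dispatcher over the phases of a button press. It is our illustration rather than the actual Holitouch implementation; the activation threshold and frequency values are assumptions.

```python
# Illustrative dispatcher (not the authors' code) mapping virtual button
# displacement to the three Holitouch feedback channels.
BUTTON_DEPTH = 0.15       # full virtual button travel in meters (from the paper)
ACTIVATION_DEPTH = 0.10   # hypothetical click threshold

def vibration_frequency(depth, f_min=80.0, f_max=240.0):
    # Hypothetical frequency/displacement mapping for the activation cues
    return f_min + (f_max - f_min) * min(depth / BUTTON_DEPTH, 1.0)

def update_feedback(depth, prev_depth):
    """Return the feedback events for the current virtual displacement."""
    events = []
    if prev_depth <= 0.0 < depth:
        events += ["fingertip_tap", "tes_pulse"]       # contact
    if 0.0 < depth:
        events.append("apply_cd_ratio")                # stiffness (visual)
        events.append(f"piezo@{vibration_frequency(depth):.0f}Hz")  # activation
    if prev_depth < ACTIVATION_DEPTH <= depth:
        events.append("click")                         # operation threshold
    return events
```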

3.1 Wearable Device

The wearable device (see Figure 1A) provides the user with feedback for the sensations of contact and activation, enabling tactile feedback (i.e., tapping and vibrotactile feedback by piezo actuation) and proprioceptive feedback by electrical stimulation (i.e., TES). The device is compatible with tracking systems (i.e., OptiTrack Prime 13W, Oculus Quest 1/2), enabling the system to trigger the appropriate feedback whenever it detects a finger collision or interaction (i.e., press/release) with a virtual button. The device is part of a framework for haptic experiments, providing a Unity plugin to wirelessly control solenoids, piezo elements, audio-based haptic transducers, and EMS/TES electrodes. The actuator-to-PC latency is approximately 28 ms. We published all the resources required to replicate the experiment in an open software/hardware repository2.
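As a rough illustration of how a host application might drive such a module wirelessly, consider the hypothetical sketch below; the address, port, and message format are our assumptions, not the protocol of the published repository.

```python
# Hypothetical host-side trigger (illustrative only; see the open repository
# for the actual Unity plugin and wire protocol).
import json
import socket

MODULE_ADDR = ("192.168.0.42", 9000)  # placeholder address and port

def send_feedback(event: str, intensity: float) -> None:
    """Fire a single actuator command, e.g. send_feedback('tap', 0.8)."""
    msg = json.dumps({"event": event, "intensity": intensity}).encode()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(msg, MODULE_ADDR)
```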


FIGURE 1. Holistic approach to convey touch illusions by combining pseudo-haptics with haptic feedback to interact with a 3D button in VR. (A) Our wearable device provides tactile and proprioceptive feedback to convey the sensations of contact and activation by providing tapping plus tendon electrical stimulation, and vibrotactile feedback, respectively. (B) The sensation of stiffness is achieved by the use of pseudo-haptics, manipulating the distance between the rendered virtual hand and the real hand. The real hand location is added for illustration.

The modules are driven by Espressif ESP32 dual-core microprocessors clocked at 240 MHz with 520 KB RAM and powered by rechargeable 18650HG2 Li-Ion batteries (3.7 V). The tactile module (see Figure 2A) measures 116 mm × 65 mm and provides tapping and vibration stimuli. The tapping sensation recreates the pressure-based tactile feedback at the fingertip when a button is initially touched, as it stimulates the SA1 receptors for pressure (i.e., Merkel cells). A tactile tap is better suited here than vibrotactile feedback, as vibration is sensed by the Pacinian corpuscles deeper in the skin, which do not contribute to pressure sensing (Kuroki et al., 2007). Additionally, a TES/EMS module measuring 116 mm × 52 mm provides the tendon stimulation (see Figure 2B). The units were located on the upper arm of the subject and connected to a wearable FingerTip (see inset in Figure 1, left) measuring 64 mm × 32 mm × 32 mm. The TES unit uses an off-the-shelf transcutaneous nerve stimulator (Schwa Medico Medizintechnik Art.-Nr. 104099-V08), and the PCB and firmware were built upon the openEMSstim project by Pedro Lopes3.


FIGURE 2. Detailed picture of the wearable modules. (A) The tactile module, used to provide tapping and vibrotactile feedback using solenoids and piezo elements respectively. (B) The electrical stimulation module, used to provide proprioceptive cues to induce the illusion of finger extension by using electrodes in the dorsal part of the hand.

In summary, the wearable modules provide redundant but congruent signals to enrich our VR setup and simulate the whole range of sensations offered by a real button, improving stiffness perception by adding tactile (fingertip) and proprioceptive (hand tendon) cues related to the contact and activation of a 3D button.

3.2 Pseudo-haptic Technique

To convey the sensation of stiffness, we propose a solution that is compatible with the wearable device and not tied to grounded, complex haptic systems. To this end, we implemented a pseudo-haptic technique that exploits visual dominance over proprioception as a substitute for the resistive forces arising when pressing a button, altering the perceived stiffness and creating higher levels of embodiment and presence. We used pseudo-haptics within a valid range of C/D ratio manipulation that alters the perception of stiffness in a 3D button. With changes in the C/D ratio, we created a discrepancy between the displacements of the real hand (control) and the virtual hand (display) while pressing the button. For example, to perceive a stiffer button in VR, a longer displacement in the real world was required (see Figure 1B).
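A minimal sketch of this manipulation is shown below (our illustration, not the published implementation); `press_gain` denotes the factor by which the real press must exceed the virtual button travel, and its exact mapping onto the C/D ratio used in the model of Section 4.1 is an assumption.

```python
# Illustrative pseudo-haptic press: scale the real finger depth so that a
# "stiffer" button (press_gain > 1) needs a longer real-world movement
# for the same visual travel.
BUTTON_DEPTH = 0.15  # virtual button travel in meters (from the paper)

def virtual_finger_depth(real_depth: float, press_gain: float) -> float:
    return min(real_depth / press_gain, BUTTON_DEPTH)
```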

3.3 Interaction Space

The experiment setup (see Figure 3A) included a table measuring 140 cm × 70 cm placed in the center of a tracking space composed of four OptiTrack Prime 13W cameras running at 240 FPS and calibrated with a 3D mean error of 0.213 mm, providing reliable 6DOF tracking of the index fingertip of the participant’s dominant hand. We used a PC workstation equipped with an Intel i7 processor, an Nvidia GTX 1080, and an Oculus Quest HMD connected via Link technology, rendering the IVE using Unity 3D. The participant performed the experiment in a seated position. The coordinate systems of the Oculus Quest and the OptiTrack system were registered by the operator at the beginning of every session as soon as the subject was seated comfortably. After registration, the subject was able to feel passive haptic feedback from the virtual table. Under the table, the participant pressed two pedals with the feet, enabling left/right answers while focusing on the hand-based experiment task. The visual stimulus presented a virtual 3D replica of the real table registered at the same position and orientation. Also, a 3D pointing hand was rendered matching the position and orientation of the participant’s index finger. On top of the table, two virtual buttons with a front diameter of 18 cm and a displacement depth of 0.15 m were displayed (see Figure 1B). Upon collision between button and fingertip, the virtual fingertip of the participant was always projected onto the button surface to avoid confusing visual interpenetrations, following the “God Object”-based method defined by Zilles and Salisbury (1995) and Harwin and Melder (2002). To prevent bias and confounding effects related to cross-modal references for color, size, and physical properties in general, both buttons were rendered with precisely the same material shader, mesh, size, and displacement depth (Parise, 2015). The subjects listened to a white-noise loop during the whole experiment in order to suppress the auditory modality and keep the focus on visual, tactile, and proprioceptive feedback. As the experiment requires exhaustive arm movements, the subjects were asked to operate the buttons with the elbow supported on an ergonomic rubber pad to make interactions more comfortable and prevent the gorilla-arm syndrome (Lubos et al., 2016; Hansberger et al., 2017; Jang et al., 2017). Subjects could ask for breaks whenever they felt fatigued.


FIGURE 3. Interaction space for the experiments. (A) General setup. The participant interacts with two virtual buttons (i.e., standard and comparison) and selects the stiffer button using the corresponding pedals under the table. (B) Setup for experiment 1. The participant interacts with two virtual buttons. The left one provides stiffness with pseudo-haptics as presented in the general setup. The right one is co-located with a robotic arm exerting veridical stiffness. The users are requested to manipulate the stiffness provided by the right button using the pedals under the table until the perceived stiffness matches the left button. (C) Setup for experiment 2. The participant interacts with two virtual buttons (i.e., standard and comparison) and selects the stiffer button using the corresponding pedals under the table. Additionally, the buttons were enriched with tactile and proprioceptive feedback using the wearable modules strapped (i.e., Velcro closures) around the exterior side of the upper-right arm, and actuators instrumented in the forearm and the index finger.

4 Methods

In order to inform the design and parameters of Holitouch, we conducted two perceptual user experiments. Experiment 1 focused on creating a convincing and accurate sensation of stiffness: we used a valid range of C/D ratios to simulate the sensation of stiffness in VR and directly compared it with veridical stiffness rendered by a robotic arm (i.e., a Phantom Omni). As a result, we could calculate a specific amount of stiffness (N/m) as a function of the C/D ratio. We used multisensory integration methodologies to create a predictive model, which allows us to display plausible stiffness levels without complex grounded force-feedback technologies with limited workspaces (e.g., Virtuose, Phantom). In Experiment 2, we added tactile actuation and brief phases of TES to introduce contact and activation cues. We used the aforementioned predictive model to design the corresponding contact and activation sensations for the case of pressing a button. As a result, we evaluated our wireless haptic wearable for integrating the sensations of contact and activation, and quantitatively characterized their relation to stiffness.

First of all, we conducted a pilot study using pseudo-haptics to find a valid range of C/D ratio manipulation that alters the perception of stiffness in a 3D button, focusing on visual feedback only as a baseline condition (see Figure 3A). 12 participants were recruited for this pilot (age: M = 26.1, SD = 3.66, no female participants, four left-handed). None of the participants reported any visual impairments that could affect the results of the experiment. We conducted a two-alternative forced-choice (2AFC) experiment: the participant was asked to choose the stiffer button using the foot pedals after exploring a standard and a comparison button for up to 20 s. The C/D ratio was manipulated so that one of the buttons presented the standard stimulus (i.e., C/D = 1.0) and the other one the comparison with one of the following values: 0.2, 0.4, 0.6, 0.8, 1.2, 1.4, 1.6, 1.8. The presentation order of the buttons and the C/D ratios was randomized per trial, with ten repetitions per C/D ratio, yielding 1,080 data samples across all participants. The mean time per participant to complete the pilot study was about 60 min.

The results show that C/D ratios in the range of 0.905–1.155 can be applied to simulate stiffness in 3D buttons while providing a believable illusion, as such manipulations of the real-to-virtual mapping remain unnoticed. The analysis also reported valid ranges for the Point of Subjective Equality (PSE) and the Just Noticeable Difference (JND). The results of this pilot study confirm a working apparatus and provide a validated C/D ratio range for the following experiments, which contrast the effects of the Holitouch technique on perception thresholds and subjective evaluations.
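For illustration, the following Python sketch shows how psychometric quantities like the PSE and JND can be estimated from pooled 2AFC responses; the response proportions below are hypothetical, and our actual analysis pipeline may differ in detail.

```python
# Illustrative psychometric fit for 2AFC data (hypothetical proportions).
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(x, pse, sigma):
    # Probability of judging the comparison button as stiffer
    return norm.cdf(x, loc=pse, scale=sigma)

ratios = np.array([0.2, 0.4, 0.6, 0.8, 1.2, 1.4, 1.6, 1.8])
p_stiffer = np.array([0.05, 0.10, 0.20, 0.35, 0.70, 0.85, 0.92, 0.97])

(pse, sigma), _ = curve_fit(psychometric, ratios, p_stiffer, p0=[1.0, 0.2])
jnd = norm.ppf(0.75) * sigma   # distance from the PSE to the 75% point
print(f"PSE = {pse:.3f}, JND = {jnd:.3f}")
```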

4.1 Experiment 1: Model of Stiffness and Multi-Sensory Integration

In this experiment, we compared real stiffness provided by a physical button with the stiffness simulated via the pseudo-haptic implementation used in the pilot study. Based on real proprioceptive feedback provided by a robotic arm, we used multisensory integration theory (Ernst and Banks, 2002) to create a model that predicts the perceived stiffness for a specific C/D ratio.

4.1.1 Participants

Twelve participants (age: M = 24.8, SD = 3.53, two female, two left-handed) took part in this experiment, mainly students and employees from our computer science department. According to the pre-questionnaire, none of the participants reported any visual impairments that could affect the results of our experiment.

4.1.2 Apparatus

The apparatus for this experiment builds upon the general setup (see Figure 3A), adding a robotic arm (i.e., a Phantom Omni device) to provide force feedback and dynamically simulate stiffness for its virtual button counterpart (see Figure 3B). The right virtual button was co-located with the Omni’s effector tip, so the center of the button surface provided the corresponding stiffness whenever the user touched the virtual button, making real stiffness perceivable via the user’s real index finger. The participants were able to manipulate the stiffness provided by the right button, increasing or decreasing it by 0.1 N/m using the right and left pedals, respectively.

4.1.3 Procedure

We used the method of adjustments: the participants used the right hand on the Omni’s effector tip to operate the right virtual button and the left hand to operate the left virtual button, which provided stiffness by modifying the C/D ratio with one of the following values: 0.2, 0.4, 0.6, 0.8, 1.2, 1.4, 1.6, 1.8. In every trial, the participants were asked to adjust the stiffness of the right button using the pedals until they perceived a match with the stiffness of the left button. They could adjust it for up to 30 s, followed by a confirmation (via gaze pointing on a virtual pop-up display) to record the data and start the next trial.

We used psychophysical methodologies (Samad et al., 2019) to compare stiffness between pseudo-haptics (i.e., C/D ratio manipulation) and real force feedback (i.e., the Phantom device), using the forced-fusion strategy (Helbig and Ernst, 2007) of multisensory integration (Ernst and Banks, 2002) to create a predictive model for perceived stiffness in VR given a C/D ratio. To do this, we solved the integration calculations for the proprioceptive signal (i.e., real force feedback on the right button) and the visual signal (i.e., pseudo-haptics on the left button).

First, subjects were asked to press the virtual buttons completely with the virtual index finger, with a visual displacement of $d_{visual} = 0.15$ m, moving the real index finger by the proprioceptive displacement $d_{proprioceptive}$ (Eq. 1), where $CDR$ denotes the applied C/D ratio. We further define $s_{proprioceptive}$ as the proprioceptive stiffness felt at the finger, so the force applied by the button to the fingertip follows from the proprioceptive distance $d_{proprioceptive}$ (Eq. 2).

$$d_{proprioceptive} = \frac{0.15}{CDR}\,\text{m} \tag{1}$$

$$F = d_{proprioceptive} \cdot s_{proprioceptive} \tag{2}$$

$$d_{perceived} = \alpha \cdot d_{proprioceptive} + \beta \cdot d_{visual} \tag{3}$$

$$s_{perceived} = \frac{F}{d_{perceived}} \tag{4}$$

$$s_{perceived} = \frac{d_{proprioceptive} \cdot s_{proprioceptive}}{\alpha \cdot d_{proprioceptive} + \beta \cdot d_{visual}} = \frac{\frac{0.15}{CDR} \cdot s_{proprioceptive}}{\alpha \cdot \frac{0.15}{CDR} + \beta \cdot 0.15} = \frac{s_{proprioceptive}}{\alpha + \beta \cdot CDR} \tag{5}$$

$d_{proprioceptive}$ can differ from the visual displacement $d_{visual}$ when $CDR$ is manipulated, but the multisensory signals are integrated with the proprioceptive and visual weights constrained by $\alpha + \beta = 1$. Thus, the perceived displacement $d_{perceived}$ is a weighted combination of the two cues (Eq. 3), and the perceived stiffness can be expressed in terms of $d_{proprioceptive}$ (Eqs. 4, 5). Once the method of adjustments was completed, we used the matched data for the veridical stiffness felt on the right index finger ($s_{proprioceptive}$) and the corresponding $CDR$ values used to render stiffness for the left finger (see gray dots in Figure 4) to estimate the missing proprioceptive and visual weights (i.e., $\alpha$ and $\beta$): Eq. 5 was fitted to the proprioceptive stiffness data from the experiment to obtain the integration weights. Figure 4 shows the perceived stiffness according to the experimental data and the red fitting curve of the integration model (R = 0.959462).


FIGURE 4. Data from experiment 1 on perceived stiffness by C/D ratio (gray dots), and the corresponding fitting with Eq. 5 by C/D ratio (R = 0.959462).
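For readers who want to reproduce this kind of fit, the following Python sketch shows how the integration weights of Eq. 5 could be estimated with a standard least-squares routine; the matched-stiffness values below are placeholders, not our measured data.

```python
# Illustrative sketch of fitting Eq. 5 (not the authors' analysis code).
import numpy as np
from scipy.optimize import curve_fit

def eq5(cdr, alpha, s_prop):
    beta = 1.0 - alpha            # integration weights sum to one
    return s_prop / (alpha + beta * cdr)

cdr = np.array([0.2, 0.4, 0.6, 0.8, 1.2, 1.4, 1.6, 1.8])
matched = np.array([0.46, 0.49, 0.51, 0.53, 0.56, 0.59, 0.61, 0.64])  # placeholder data

(alpha, s_prop), _ = curve_fit(eq5, cdr, matched, p0=[1.0, 0.5])
print(f"alpha = {alpha:.2f}, beta = {1.0 - alpha:.2f}, s_prop = {s_prop:.2f}")
```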

As a result, we obtained the predictive model equation that enables the calculation of perceived stiffness in VR buttons for a given C/D ratio (Eq. 6). Based on the psychometric results from the pilot study, we can restrict the model to the identified range of C/D ratios (i.e., CDR = 0.905–1.155) and use the equation to obtain a range of perceptually congruent stiffness values from 0.524 N/m to 0.568 N/m, which are valid stiffness values for finger forces (Georgiou, 2014).

$$\text{Stiffness (N/m)} = s_{perceived} = \frac{0.54}{1.32 - (0.32 \cdot CDR)} \tag{6}$$

Stiffness is expressed in Newtons per meter (N/m); it describes the counterforce exerted by the button as a function of its displacement, where $CDR$ is the applied C/D ratio.
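As a sanity check, Eq. 6 can be transcribed directly into code; the following minimal Python snippet (ours, for illustration) reproduces the stiffness range reported above for the unnoticeable C/D ratio interval.

```python
# Eq. 6 transcribed to code: predicted perceived stiffness (N/m) for a
# given C/D ratio. Constants are taken from the fitted model above.
def predicted_stiffness(cdr: float) -> float:
    return 0.54 / (1.32 - 0.32 * cdr)

for cdr in (0.905, 1.0, 1.155):
    print(f"CDR = {cdr:.3f} -> {predicted_stiffness(cdr):.3f} N/m")
# CDR = 0.905 -> 0.524 N/m, CDR = 1.000 -> 0.540 N/m, CDR = 1.155 -> 0.568 N/m
```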

4.2 Experiment 2: Enriching Stiffness With Tactile and Proprioceptive Feedback

The second experiment builds on the results of experiment 1, reusing the evaluated setup to provide the sensation of stiffness. However, when pressing a real button, visual, tactile, and proprioceptive feedback are experienced simultaneously. We want to close this gap by providing tactile and proprioceptive feedback to convey the missing sensations of contact and activation. Table 1 shows the sensations we want to convey and the feedback we provide. The sensation of stiffness (S) is provided by C/D ratio manipulation using the results of experiment 1, which delivered a solid apparatus to quantify the perceived stiffness within valid perceptual limits. The contact (C) sensation is provided by tactile and proprioceptive signals triggered whenever the subject’s fingertip touches a virtual button. Finally, the sensation of activation (A) is provided by vibration according to the inner mechanics of a button, delivering dynamic cues while the subject’s finger presses, displaces, and finally activates the virtual button. As a result, we evaluated our wireless haptic wearable for integrating the sensations of contact and activation, and quantitatively characterized their relation to stiffness.

In our experiment, we kept the activation and contact sensations constant and only varied the C/D ratio to reduce the complexity and the number of trials. By this, we ensured a feasible experiment and prevented fatigue effects that would lower the data quality. Our holistic approach integrates our findings from experiment 1 with carefully chosen and already validated stimuli from previous research, ensuring that the presented parameters for the activation and contact sensations provide meaningful and timely feedback. In the same way, we discarded sensations like temperature and texture (already evaluated by Kammermeier et al. (2004)), as they are neither essential nor relevant for 3D buttons.


TABLE 1. Multisensory feedback and the corresponding sensations conveyed and evaluated in experiment 2.

4.2.1 Participants

Eight participants (age: M = 29.3, SD = 4.3, four female, two left-handed) took part in this experiment, mainly students and employees from our computer science department. According to the pre-questionnaire, none of the participants reported any visual impairments that could affect our experiment’s results.

4.2.2 Approach

In common IVE scenarios, whenever a user approaches and collides with a virtual object, the visual feedback is correct, but only a fraction of the haptic sensations is rendered: the fingertip tactile sensation of the collision and the proprioceptive sensation of the movement being interrupted are missing, producing large sensory conflicts. Such conflicts are crucial for human-computer interaction in IVEs, as they can induce breaks in presence, reduce task performance, require higher cognitive effort during the multisensory integration process, and finally degrade the overall user experience in the IVE. The goal of our research is to provide a holistic touch sensation by providing tactile and proprioceptive feedback together with the visual illusion of touch.

Figure 5 describes the modules and corresponding technology that convey the different sensations step by step. The VR user approaches and finally touches a virtual button with the index finger, causing our system to provide tactile (tapping) and kinesthetic (finger extension) feedback to convey the illusion of contact. While the user pushes the button, the pseudo-haptics induce a sensation of stiffness; simultaneously, the changing frequency of vibrotactile feedback indicates the activation (during press/release, a higher frequency is generally associated with higher stiffness). Finally, after the button is released, all feedback is disabled. The same figure provides details regarding the intensities, frequencies, durations, and forces involved in the feedback provided for the different sensations.


FIGURE 5. Step-by-step process of approaching, pressing, and releasing a virtual button, together with the corresponding technology that conveys the different sensations involved in such 3DUI interactions.

The activation sensation is rendered by the use of frequency/displacement curves (Kim and Lee, 2013; Liao et al., 2018). Such a curve can easily be adapted to define different button behaviors for press/release movements. To this end, the curves define dynamic changes of frequency according to the button displacement. Additionally, we could use the curves to simultaneously change C/D ratios for pseudo-haptics to render variable stiffness.
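A possible encoding of such a curve is sketched below; the control points are invented for illustration and would in practice be derived from measured force-displacement profiles of real buttons.

```python
# Illustrative frequency/displacement curve for the activation cues
# (control points are assumptions, not measured button profiles).
import numpy as np

# (normalized displacement, piezo drive frequency in Hz)
PRESS_CURVE = [(0.0, 0.0), (0.4, 120.0), (0.7, 250.0), (1.0, 180.0)]

def activation_frequency(displacement: float) -> float:
    xs, ys = zip(*PRESS_CURVE)
    return float(np.interp(displacement, xs, ys))
```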

In this work, we simulate impact by decomposing the resulting sensations into tactile and proprioceptive feedback, as explored before (Lopes et al., 2015). However, we use TES (Miyamoto et al., 2015) because it is more compatible with the sensorimotor frameworks that make VR illusions possible (Gonzalez-Franco and Lanier, 2017). In our case, the sense of agency (i.e., volition) helps the user to internalize the VR experience as real, since the user is the initiator of the action (i.e., efference; the ballistic-correction arm movement to press the button) and is artificially provided with afferent feedback from the Golgi tendon organ of the extensor indicis muscle. Besides, TES is beneficial for conveying contact sensations while avoiding unwanted small movements (i.e., the Heisenberg effect (Argelaguet and Andujar, 2013)) during 3D interaction. Such movements can deteriorate the quality of the experience, because an abrupt hand movement would be noticed immediately, as the C/D ratio manipulation drives the user’s stiffness perception. This is also supported by the view of stiffness perception as the result of cutaneous and proprioceptive integration provided by the kinesthetic subsystem and the efference copy (Metzger and Drewing, 2015).

4.2.3 Latency Compensation

Non-simultaneous multimodal feedback poses problems for motor control, producing temporal asynchrony and inadequate cue integration. For this reason, synchronizing the feedback is essential to deliver a compelling and plausible experience; time gaps between feedback signals may require predicting triggering events in order to compensate and finally convey all the multimodal sensations in unison. The Minimum Jerk (MJ) movement model (Flash and Hogan, 1985) predicts the ballistic phase of arm movements, assuming that the primary goal of motor coordination is to produce the smoothest possible movement of the hand in extracorporal space. The model has been tested in IVEs for teleoperation purposes, showing that it can deal with delays of up to 100 ms (Bratt et al., 2007). The model can also be used locally in IVEs to compensate for delays and latency, with an average error of 290 ms with hand redirection and 30 ms with normal hand reach (Gonzalez et al., 2019). A user can reliably detect the asynchrony if haptic feedback is presented more than 50 ms after the visual stimuli (Di Luca and Mahnan, 2019).

Prior solutions involved the use of larger bounding volumes (i.e., 25% larger) for collision detection, delivering early feedback (i.e., 30 ms) to compensate for the EMS latency (Lopes et al., 2015). Regarding our use case, related research provides some guidelines. Physical contact with a button occurs within about 100 ms (Kim et al., 2013), which is too brief for real-time corrections (Gomi, 2008). TES and EMS are prone to latencies involving the propagation of the motor signals in the limbs; for this reason, multimodal combinations of TES and electrotactile feedback require an early activation of TES, starting 25 ms before the tactile signal, to guarantee the perception of simultaneity (Miyamoto et al., 2015). We implemented the MJ model to reduce the latency of our TES unit using data from our real-time velocity profile module (Ariza et al., 2018). Thus, the TES stimulation is not triggered by the button collider but by a prediction algorithm based on the velocity profile of the ballistic approach and the MJ output. As a result, we can calibrate the model at the beginning of the experiment session and reduce the latency to be compatible with the perceptual limits described above.
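The following Python sketch illustrates the idea of pre-triggering TES with an MJ profile; it is a simplification of the described velocity-profile approach, and the numeric parameters are examples.

```python
# Illustrative Minimum Jerk (MJ) pre-triggering of TES (simplified sketch).
def mj_position(t, t_total, x0, xf):
    """MJ position along the ballistic reach (Flash and Hogan, 1985)."""
    tau = min(max(t / t_total, 0.0), 1.0)
    return x0 + (xf - x0) * (10 * tau**3 - 15 * tau**4 + 6 * tau**5)

def tes_trigger_time(contact_x, t_total, x0, xf, lead=0.025, dt=0.001):
    """Earliest time to fire TES so the sensation lands at contact,
    compensating a 25 ms actuation lead (Miyamoto et al., 2015)."""
    t = 0.0
    while t < t_total and mj_position(t, t_total, x0, xf) < contact_x:
        t += dt
    return max(t - lead, 0.0)
```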

4.2.4 Procedure

Table 2 shows the conditions of the experiment. We wanted to test how the addition of modalities affects the perceptual limits, taking the stiffness-only sensation as our control condition. In this experiment, subjective user acceptance was also evaluated.


TABLE 2. Conditions and sensations conveyed for experiment 2.

The apparatus for this experiment builds upon the general setup, incorporating the wearable and wireless haptic device (see Figure 1A and Figure 3C). At the beginning, the TES analog stimulator was calibrated for every participant to guarantee that every subject was comfortable with the current levels applied to the electrodes. TES calibration was performed per subject and lasted 5 min; the extensor indicis muscle was contracted ten times to guarantee a reliable and constant level of stimulus. Electrodes were placed along the hand and wrist, following the oblong structure of the muscles, with the anode located on the tendon and the cathode on the muscle spindle. The stimulus was intended to produce a sensation of force traveling in the direction opposite to the side where the electrode was affixed, so that the subject could feel a subtle index-finger extension. During the calibration process, participants practiced the reaching movement to press the virtual button to guarantee homogeneous velocities and a reliable prediction of the collision time to compensate for latencies.

We again conducted a two-alternative forced-choice (2AFC) experiment: the participant was asked to choose the stiffer button using the foot pedals after exploring a standard and a comparison button for up to 20 s. Both buttons provided the same sensations dictated by the current condition (see Table 2). Only the C/D ratio was manipulated, so one of the buttons presented the standard stimulus (i.e., C/D = 1.0) and the other one the comparison with one of the following values: 0.7, 0.8, 0.9, 1.1, 1.2, 1.3. The presentation order of the buttons and the C/D ratios was randomized per trial.

We collected subjective data from a questionnaire on user acceptance using a 7-point Likert scale (1 = disagree, 7 = agree) for all questions. The questionnaire was answered after every condition (i.e., four times per subject), once the user had explored all the trials (i.e., C/D ratios). The presentation order was counterbalanced.

We measured the sensations conveyed:

• Q1 I felt stiffness in the buttons.

• Q2 I felt when I touched the buttons.

• Q3 I felt I was pressing the buttons.

• Q4 I had the feeling of manipulating real buttons.

And limb ownership and realism:

• Q5 The virtual hands appeared in the same location as my hands.

• Q6 The virtual hands seemed to belong to my body.

• Q7 The buttons that I pressed were the same buttons that I saw.

• Q8 The buttons that I saw were in the same location as the buttons I felt.

5 Results

This section presents the results for the experimental conditions in Table 2. With four conditions, six C/D ratios, and six repetitions, we obtained 144 data samples per participant and 1,152 data samples for the whole experiment.

First, we again fitted psychometric functions to the data relating C/D ratios to the probability of perceiving the stiffer button under the different conditions (see Figure 6). Condition SA presented a PSE of 1.0328, a JND of 0.1186, and 85.07% correct responses for the stiffer button. Condition SC gave a PSE of 1.0624, a JND of 0.1032, and 85.42% correct responses for the stiffer button. Finally, condition SCA presented a PSE of 1.0155, a JND of 0.0716, and 93.75% correct decisions for the stiffer button. Condition S, providing only pseudo-haptics for the stiffness sensation, presented 70.49% correct choices for the stiffer button.


FIGURE 6. C/D ratio vs. stiffer responses for the tested conditions (see Table 2): (A) stiffness only, (B) stiffness + activation, (C) stiffness + contact, and (D) stiffness + contact + activation.

Knowledge about the range of C/D ratios that can be applied without users noticing the discrepancy is essential to provide a plausible God Object constraint for 3DUI elements (Zilles and Salisbury, 1995; Harwin and Melder, 2002). The C/D ratios found for the conditions SA, SC, and SCA are in line with prior findings on hand-redirection gains along the depth axis (Zenner and Krüger, 2019), which found that a range of 0.88–1.07 (−6.18% to 13.75%) can go unnoticed.

The mean JND differences between the conditions S (M = 0.2476, SD = 0.0537), SA (M = 0.1186, SD = 0.0300), SC (M = 0.1032, SD = 0.0545), and SCA (M = 0.0716, SD = 0.0241) were statistically significant according to a Friedman test (χ2(3, N = 8) = 7.95, p = 0.0471, W = 0.331). Post-hoc pairwise comparisons using Wilcoxon signed-rank tests with Bonferroni correction indicated that our holistic approach (condition SCA) presented significantly lower JNDs than providing stiffness only (condition S, p = 0.028) or stiffness + activation (condition SA, p = 0.044).

Likewise, we checked for differences in the percentage of correct responses (PCR) among conditions, finding a significant main effect according to a Friedman test (χ2(3, N = 8) = 17.2, p = 0.0006, W = 0.718). Wilcoxon signed-rank pairwise comparisons with Bonferroni correction indicated that our holistic approach (condition SCA, M = 93.75%, SD = 6.92, p = 0.015) and stiffness + activation (condition SA, M = 85.07%, SD = 4.18, p = 0.027) have significantly higher PCRs than stiffness only (condition S, M = 70.49%, SD = 12.3).
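For transparency, the sketch below shows the shape of this nonparametric analysis in Python, with simulated data standing in for the per-participant JNDs; it mirrors the reported procedure but not our exact scripts.

```python
# Illustrative analysis sketch with simulated JNDs (8 participants x
# 4 conditions: S, SA, SC, SCA); not the study data.
import numpy as np
from scipy.stats import friedmanchisquare, wilcoxon

rng = np.random.default_rng(0)
jnd = rng.normal([0.25, 0.12, 0.10, 0.07], 0.03, size=(8, 4))

stat, p = friedmanchisquare(*jnd.T)   # omnibus test across conditions
print(f"Friedman chi2 = {stat:.2f}, p = {p:.4f}")

# Post-hoc Wilcoxon signed-rank tests, Bonferroni-corrected (3 comparisons)
for i, label in ((0, "S"), (1, "SA"), (2, "SC")):
    _, p = wilcoxon(jnd[:, i], jnd[:, 3])
    print(f"SCA vs {label}: p = {min(p * 3, 1.0):.4f}")
```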

Second, we processed the data from the subjective questionnaires (see Figure 7 and Table 3), performing the omnibus test with an ordinal logistic regression and post-hoc simultaneous pairwise comparisons using Tukey’s HSD correction. The answers to the questions evaluating the sensations conveyed show significantly higher scores for condition SCA over S, with overall higher user-acceptance scores for all sensations. Condition SC presented high scores for the sensation of contact, and condition SA showed high scores for the sensation of activation, compared with condition S. The scores for the subjective acceptance of limb ownership and realism were generally high for all conditions, with significant differences related to visuo-haptic mismatches in condition S compared to SC, SA, and SCA.


FIGURE 7. Results for the subjective questionnaires.


TABLE 3. Descriptive statistics and omnibus/post-hoc test results for the subjective questionnaires. The condition acronyms stand for the sensations conveyed: Stiffness, Contact, and Activation (significance codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05).

6 Discussion

With this work, we wanted to go beyond abstract vibrotactile feedback, such that haptics could realistically convey the sensations of interacting with virtual user interface elements (i.e., virtual buttons in our use case) that possess physical properties.

The conducted experiments examine the effects of conveying contact, activation, and stiffness sensations while pressing virtual buttons, building upon recent research (Pezent et al., 2019; Samad et al., 2019). The choice of sensations unifies premises from related research, aiming for rich 3DUI elements that support useful affordances. Hence, the user knows what can be done, whether the interaction element is working, and when the action/task is complete.

Sensitivity to Stiffness

     We found that the PSE, i.e., the point at which a physical stiffness is estimated as equal to its virtual counterpart, was closer to identity when other sensations were added. The JND decreased, thus increasing the sensitivity to stiffness changes, when sensations of contact and activation were added. These findings evidence the advantages of multimodal feedback and provide guidelines for 3DUI elements involving multimodal feedback designs. Our proposed holistic rendering approach results in an increased sensitivity of the user to the haptic qualities of the buttons (i.e., the stiffness of the 3D button). In other words, by showing that the JNDs are significantly smaller (while the percentages of correct responses were significantly higher) when multiple sensations were involved (SCA), we could quantify and prove that users can distinguish more (and more fine-grained) levels of virtual button stiffness. This result is also important because it shows that more distinguishable levels of feedback can be produced within the same range of C/D ratio manipulation when multiple modalities are involved, as the back-of-the-envelope calculation below illustrates. Furthermore, the subjective results also presented significant scores in favor of our holistic approach.
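As a rough, purely illustrative calculation (taking the unnoticeable gain range of 0.88–1.07 from above and one JND as the step between distinguishable stiffness levels):

```latex
N \approx \frac{r_{\max} - r_{\min}}{\mathrm{JND}}, \qquad
N_{\mathrm{SCA}} \approx \frac{1.07 - 0.88}{0.0716} \approx 2.7, \qquad
N_{\mathrm{S}} \approx \frac{1.07 - 0.88}{0.2476} \approx 0.8
```

Under these assumptions, the holistic condition supports roughly three distinguishable stiffness levels within the unnoticeable gain range, whereas pseudo-haptics alone supports less than one.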

Sensations Conveyed

     The conditions providing multimodal feedback including contact information presented significantly higher scores when participants were asked about the feeling of stiffness (Q1) and of touching a button (Q2, with even larger differences over stiffness only), showing a trend in favor of contact over activation. Q3 presented the opposite trend: here, the sensation of pressing/releasing a button was supported by modulated vibrotactile feedback only in the conditions SA and SCA, but not in condition SC (i.e., without activation cues). In SC, only tapping sensations conveying contact were provided. As a result, users did not feel like they were pressing/releasing a button as strongly as in SA and SCA, due to the missing activation cues (derived from the frequency-based displacement curve) applied to the index fingertip (see the activation step in Figure 5). For cases where an integration of all the proposed modalities is impractical, our results also highlight that adding contact-only (SC) or activation-only (SA) feedback can already enhance the experience of interacting with virtual buttons compared to pseudo-haptics alone, regarding the subjective feeling of touching and pressing 3D buttons (Q2 and Q3, respectively).

Limb Ownership and Realism

     The scores for subjective acceptance in this category were mainly high for all conditions except stiffness only (condition S, providing visuo-haptic mismatches only). Here, the subjects reported significantly higher scores (equally for all the multimodal conditions: SC, SA, and SCA) when asked about the feeling of manipulating real buttons (Q4) and the proprioceptive as well as visual realism of the 3D buttons (Q7 and Q8, respectively). As a 3DUI design guideline, the inclusion of complementary modalities to convey sensations of contact, activation, or both could improve the overall sense of presence and realism while interacting with 3D buttons.

From a high-level perspective, our approach could be integrated into future consumer VR systems, as it is wearable and the utilized actuation technology is rather low-cost compared to the grounded active haptic robots traditionally used to convey the stiffness of 3DUI elements, which suffer from severe workspace limitations and involve complex setups. As such, our proposed approach can be seen as an intermediate solution (i.e., between the poles of purely virtual pseudo-haptics and purely physical active haptics).

The proposed technology and multimodal feedback could be used in VR and AR environments, enabling instinctual interactions. As mentioned, we aim to equip interactive elements with the right affordances so that they can be quickly learned and used with different interaction models. For example, the user could interact via virtual hand representations operating controllers or via hands-free natural interaction. These models could also be combined with a gaze-and-commit technique using voice commands for confirmations, enabling a seamless and performant interaction environment.

We also believe that the combination of feedback tackles the problem of under-/overestimation while pressing a virtual button, which prior research reported as depth perception problems (Faeth and Harding, 2014; Ariza et al., 2018). This was enabled by incorporating the sensation of button hysteresis into the activation sensation: we implemented a mechanism to render displacement/force curves, replacing the force output with dynamic changes of frequency and amplitude on the piezo element used in the conditions with activation (i.e., SA and SCA).
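A simplified sketch of this kind of rendering scheme is shown below; it maps the normalized button displacement to a piezo frequency/amplitude pair, using separate press and release curves to produce hysteresis. The curve breakpoints are purely illustrative, not our calibrated values:

```python
# Rendering an activation curve on a vibrotactile actuator: button
# displacement is mapped to (frequency, amplitude) instead of force, with
# distinct press/release curves to emulate button hysteresis.
import numpy as np

# Breakpoints: (normalized displacement, frequency [Hz], amplitude [0..1]).
PRESS_CURVE = [(0.0, 0.0, 0.0), (0.6, 180.0, 0.4),
               (0.8, 250.0, 1.0), (1.0, 120.0, 0.2)]
RELEASE_CURVE = [(0.0, 0.0, 0.0), (0.4, 150.0, 0.8),
                 (0.7, 200.0, 0.3), (1.0, 100.0, 0.1)]

def activation_signal(displacement: float, pressing: bool):
    """Interpolate the (frequency, amplitude) pair driving the piezo
    element from the normalized displacement along the active curve."""
    curve = PRESS_CURVE if pressing else RELEASE_CURVE
    d, f, a = zip(*curve)
    return np.interp(displacement, d, f), np.interp(displacement, d, a)

# Example: halfway through a press vs. halfway through a release.
print(activation_signal(0.5, pressing=True))
print(activation_signal(0.5, pressing=False))
```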

Our holistic approach still has room for improvement. The use of elongated electrodes (Takahashi et al., 2021) would improve our TES implementation with better-targeted stimulation of the tendons, and hence leverage better proprioceptive contact sensations. Likewise, our TES device is still cumbersome in that it requires per-participant calibration, and it would benefit from slimmer and more lightweight electronics, ideally integrated into a wristband for better ergonomics and comfort. Notably, our TES prototype can already modulate the stimulation intensity, enabling the rendering of dynamic tendon contractions for richer contact sensations. As a result, the current design, which modulates stiffness only, could be improved by combining it with dynamic activation curves (also currently supported). A future experiment could evaluate dynamic parameters for contact (i.e., tapping and TES), activation (i.e., piezo vibrotactile feedback), and stiffness (using curves that simultaneously change C/D ratios to render variable stiffness) altogether.

In conclusion, we introduced Holitouch, a novel approach leveraging different techniques to improve the haptic realism of simple 3D user interface elements such as 3D buttons. Holitouch integrates wearable technology that combines different types of haptic feedback to provide tapping and vibrotactile sensations, together with a pseudo-haptic method that simulates stiffness by manipulating the C/D ratio. We reported the results of two studies combining pseudo-haptic, tactile, and proprioceptive feedback for 3DUIs, providing guidelines to properly convey contact, stiffness, and activation in 3DUIs based on user acceptance and performance metrics. In addition, we used a multisensory integration model to derive an equation that predicts perceived stiffness for a given C/D ratio.
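For context, such predictions typically build on the standard maximum-likelihood model of cue integration (Ernst and Banks, 2002), in which the visual (v) and haptic (h) estimates are weighted by their reliabilities; the exact form of our equation follows from this general model:

```latex
\hat{s}_{vh} = w_v\,\hat{s}_v + w_h\,\hat{s}_h, \qquad
w_v = \frac{\sigma_h^{2}}{\sigma_v^{2} + \sigma_h^{2}}, \quad
w_h = \frac{\sigma_v^{2}}{\sigma_v^{2} + \sigma_h^{2}}, \qquad
\sigma_{vh}^{2} = \frac{\sigma_v^{2}\,\sigma_h^{2}}{\sigma_v^{2} + \sigma_h^{2}}
```

The combined estimate is thus more reliable than either cue alone, which is consistent with the lower JNDs we observed for the multimodal conditions.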

We provided objective and subjective evidence on how combinations of proprioceptive, pseudo, and tactile feedback improve the conveyed sensations of 3D buttons. Providing these sensations by using congruent feedback as redundant multisensory information enabled our approach to create a plausible illusion of touch, delivering a holistic feeling when interacting with virtual 3D buttons. As a result, the use of multimodal feedback enhanced the overall perception of stiffness, while maintaining good levels for the sense of body ownership and presence, as well as positive user acceptance.

Future experiments could embed our holistic approach into production applications involving 3D buttons (e.g., industrial training, cockpit simulations, surgical simulations) to evaluate its effects on embodiment, agency, and presence, as well as to analyze tolerances of the reported perceptual thresholds (i.e., when additional sensations are provided). Additionally, we are extending the device to support multiple fingers (e.g., for grasping), palm stimulation, and bi-manual interaction. As future work, we are also improving our device with a new set of haptic sensations relying on audio-based tactile feedback in order to support high-definition effects and the development of new 3DUI elements; testing with novel diegetic user interfaces for VR is ongoing. Further ideas involve integrating the presented approach into tracking gloves to enable improved 3DUI interaction experiences such as typing in VR/AR environments.

Data Availability Statement

The raw data supporting the conclusion of this article will be made available by the authors, without undue reservation.

Ethics Statement

The studies involving human participants were reviewed and approved by Ethics Commission of the Department of Informatics (Faculty of Mathematics, Informatics and Natural Sciences of Universität Hamburg). The patients/participants provided their written informed consent to participate in this study. Written informed consent was obtained from the individual(s) for the publication of any potentially identifiable images or data included in this article.

Author Contributions

The sections dedicated to the materials, methods, and results were mainly contributed by OA and AZ. All authors have contributed to the writing of this paper.

Funding

This research was funded by the German Research Foundation (DFG), under the project number 450247716.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher’s Note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Acknowledgments

The authors would like to thank Lucie Kruse for her valuable help and assistance during the implementation and the experiments.

Footnotes

1https://www.idtechex.com/en/research-report/haptics-2020-2030-technologies-markets-and-players/684

2https://github.com/tlon-citizen/HapBand

3https://github.com/PedroLopes/openEMSstim

References

Abtahi, P., and Follmer, S. (2018). "Visuo-haptic Illusions for Improving the Perceived Performance of Shape Displays," in Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (New York, NY: Association for Computing Machinery), 1–13. doi:10.1145/3173574.3173724

Achibet, M., Girard, A., Talvas, A., Marchal, M., and Lécuyer, A. (2015). "Elastic-arm: Human-Scale Passive Haptic Feedback for Augmenting Interaction and Perception in Virtual Environments," in 2015 IEEE Virtual Reality (VR) (Arles, Provence, France: IEEE), 63–68. doi:10.1109/vr.2015.7223325

Achibet, M., Le Gouis, B., Marchal, M., Léziart, P., Argelaguet, F., Girard, A., et al. (2017). "Flexifingers: Multi-Finger Interaction in VR Combining Passive Haptics and Pseudo-haptics," in 2017 IEEE Symposium on 3D User Interfaces (3DUI) (IEEE), 103–106. doi:10.1109/3dui.2017.7893325

Achibet, M., Marchal, M., Argelaguet, F., and Lécuyer, A. (2014). "The Virtual Mitten: A Novel Interaction Paradigm for Visuo-Haptic Manipulation of Objects Using Grip Force," in 2014 IEEE Symposium on 3D User Interfaces (3DUI) (IEEE), 59–66. doi:10.1109/3dui.2014.6798843

Adilkhanov, A., Yelenov, A., Reddy, R. S., Terekhov, A., and Kappassov, Z. (2020). VibeRo: Vibrotactile Stiffness Perception Interface for Virtual Reality. IEEE Robot. Autom. Lett. 5, 2785–2792. doi:10.1109/LRA.2020.2972793

Argelaguet, F., and Andujar, C. (2013). A Survey of 3D Object Selection Techniques for Virtual Environments. Comput. Graph. 37, 121–136. doi:10.1016/j.cag.2012.12.003

Argelaguet, F., Hoyet, L., Trico, M., and Lecuyer, A. (2016). "The Role of Interaction in Virtual Embodiment: Effects of the Virtual Hand Representation," in 2016 IEEE Virtual Reality (VR) (Greenville, SC: IEEE), 3–10. doi:10.1109/vr.2016.7504682

Ariza, O., Bruder, G., Katzakis, N., and Steinicke, F. (2018). "Analysis of Proximity-Based Multimodal Feedback for 3D Selection in Immersive Virtual Environments," in 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) (Reutlingen, Germany: IEEE), 327–334. doi:10.1109/VR.2018.8446317

Azmandian, M., Hancock, M., Benko, H., Ofek, E., and Wilson, A. D. (2016). "Haptic Retargeting," in Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (New York, NY: Association for Computing Machinery), 1968–1979. doi:10.1145/2858036.2858226

Berger, C. C., Gonzalez-Franco, M., Ofek, E., and Hinckley, K. (2018). The Uncanny Valley of Haptics. Sci. Robot. 3, eaar7010. doi:10.1126/scirobotics.aar7010

Bergström, J., Mottelson, A., and Knibbe, J. (2019). "Resized Grasping in VR," in UIST '19: Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (New York, NY: Association for Computing Machinery), 1175–1183. doi:10.1145/3332165.3347939

Bermejo, C., Lee, L. H., Chojecki, P., Przewozny, D., and Hui, P. (2021). Exploring Button Designs for Mid-air Interaction in Virtual Reality: A Hexa-Metric Evaluation of Key Representations and Multi-Modal Cues. Proc. ACM Hum.-Comput. Interact. 5, 1–26. doi:10.1145/3457141

Biocca, F., Kim, J., and Choi, Y. (2001). Visual Touch in Virtual Environments: An Exploratory Study of Presence, Multimodal Interfaces, and Cross-Modal Sensory Illusions. Presence Teleoperators Virtual Environ. 10, 247–265. doi:10.1162/105474601300343595

Bowman, D. A., Frohlich, B., Kitamura, Y., and Stuerzlinger, W. (2005). "New Directions in 3D User Interfaces," in Proceedings of the 2005 IEEE Conference on Virtual Reality (IEEE Computer Society), 312. doi:10.1109/VR.2005.58

Bratt, M., Smith, C., and Christensen, H. I. (2007). "Minimum Jerk Based Prediction of User Actions for a Ball Catching Task," in 2007 IEEE/RSJ International Conference on Intelligent Robots and Systems (IEEE), 2710–2716. doi:10.1109/iros.2007.4398989

Cheng, L.-P., Ofek, E., Holz, C., Benko, H., and Wilson, A. D. (2017). "Sparse Haptic Proxy," in Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (New York, NY: Association for Computing Machinery), 3718–3728. doi:10.1145/3025453.3025753

Choi, I., Ofek, E., Benko, H., Sinclair, M., and Holz, C. (2018). "Claw," in Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (New York, NY: Association for Computing Machinery), 1–13. doi:10.1145/3173574.3174228

Cornelio Martinez, P. I., De Pirro, S., Vi, C. T., and Subramanian, S. (2017). "Agency in Mid-air Interfaces," in Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (New York, NY: Association for Computing Machinery), 2426–2439. doi:10.1145/3025453.3025457

[Dataset] Kim, W., and Xiong, S. (2021). Pseudo-haptic Button for Improving User Experience of Mid-air Interaction in VR. arXiv. Available at: https://arxiv.org/abs/2112.11007

Di Luca, M., and Mahnan, A. (2019). "Perceptual Limits of Visual-Haptic Simultaneity in Virtual Reality Interactions," in 2019 IEEE World Haptics Conference (WHC) (Tokyo, Japan: IEEE), 67–72. doi:10.1109/WHC.2019.8816173

Dominjon, L., Lecuyer, A., Burkhardt, J., Richard, P., and Richir, S. (2005). "Influence of Control/Display Ratio on the Perception of Mass of Manipulated Objects in Virtual Environments," in Proceedings of the 2005 IEEE Conference on Virtual Reality (IEEE Computer Society), 19–25. doi:10.1109/VR.2005.49

Duente, T., Pfeiffer, M., and Rohs, M. (2017). "Zap++," in Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services (New York, NY: Association for Computing Machinery). doi:10.1145/3098279.3098546

Ernst, M. O., and Banks, M. S. (2002). Humans Integrate Visual and Haptic Information in a Statistically Optimal Fashion. Nature 415, 429–433. doi:10.1038/415429a

Faeth, A., and Harding, C. (2014). Emergent Effects in Multimodal Feedback from Virtual Buttons. ACM Trans. Comput.-Hum. Interact. 21, 1–23. doi:10.1145/2535923

Fang, C., Zhang, Y., Dworman, M., and Harrison, C. (2020). "Wireality: Enabling Complex Tangible Geometries in Virtual Reality with Worn Multi-String Haptics," in Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (New York, NY: Association for Computing Machinery), 1–10. doi:10.1145/3313831.3376470

Flash, T., and Hogan, N. (1985). The Coordination of Arm Movements: An Experimentally Confirmed Mathematical Model. J. Neurosci. 5, 1688–1703. doi:10.1523/JNEUROSCI.05-07-01688.1985

Georgiou, T. (2014). Human Haptic Perception in Virtual Environments: An Investigation of the Interrelationship between Physical Stiffness and Perceived Roughness. York, England: University of York.

Gomi, H. (2008). Implicit Online Corrections of Reaching Movements. Curr. Opin. Neurobiol. 18, 558–564. doi:10.1016/j.conb.2008.11.002

Goncalves, G., Melo, M., Vasconcelos-Raposo, J., and Bessa, M. (2020). Impact of Different Sensory Stimuli on Presence in Credible Virtual Environments. IEEE Trans. Vis. Comput. Graph. 26, 3231–3240. doi:10.1109/TVCG.2019.2926978

Gonzalez, E. J., Abtahi, P., and Follmer, S. (2019). "Evaluating the Minimum Jerk Motion Model for Redirected Reach in Virtual Reality," in The Adjunct Publication of the 32nd Annual ACM Symposium on User Interface Software and Technology (New York, NY: Association for Computing Machinery), 4–6. doi:10.1145/3332167.3357096

Gonzalez-Franco, M., and Lanier, J. (2017). Model of Illusions and Virtual Reality. Front. Psychol. 8, 1125. doi:10.3389/fpsyg.2017.01125

Hansberger, J., Peng, C., Mathis, S., Shanthakumar, V. A., Meacham, S., Cao, L., et al. (2017). "Dispelling the Gorilla Arm Syndrome: The Viability of Prolonged Gesture Interactions," in Virtual, Augmented and Mixed Reality (Cham, Switzerland: Springer International Publishing). doi:10.1007/978-3-319-57987-0_41

Harwin, W., and Melder, N. (2002). "Improved Haptic Rendering for Multi-Finger Manipulation Using Friction Cone Based God-Objects," in Proceedings of Eurohaptics Conference (Edinburgh, Scotland: Edinburgh College of Art), 82–85.

Helbig, H. B., and Ernst, M. O. (2007). Optimal Integration of Shape Information from Vision and Touch. Exp. Brain Res. 179, 595–606. doi:10.1007/s00221-006-0814-y

Jacob, R. J. K., Sibert, L. E., McFarlane, D. C., and Mullen, M. P. (1994). Integrality and Separability of Input Devices. ACM Trans. Comput.-Hum. Interact. 1, 3–26. doi:10.1145/174630.174631

Jang, S., Stuerzlinger, W., Ambike, S., and Ramani, K. (2017). "Modeling Cumulative Arm Fatigue in Mid-air Interaction Based on Perceived Exertion and Kinetics of Arm Motion," in Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (New York, NY: Association for Computing Machinery), 3328–3339. doi:10.1145/3025453.3025523

Kammermeier, P., Kron, A., Hoogen, J., and Schmidt, G. (2004). Display of Holistic Haptic Sensations by Combined Tactile and Kinesthetic Feedback. Presence Teleoperators Virtual Environ. 13, 1–15. doi:10.1162/105474604774048199

Khamis, M., Schuster, N., George, C., and Pfeiffer, M. (2019). "Electrocutscenes: Realistic Haptic Feedback in Cutscenes of Virtual Reality Games Using Electric Muscle Stimulation," in 25th ACM Symposium on Virtual Reality Software and Technology (New York, NY: Association for Computing Machinery). doi:10.1145/3359996.3364250

Kim, H., Kim, M., and Lee, W. (2016). "HapThimble," in Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (New York, NY: Association for Computing Machinery), 3694–3705. doi:10.1145/2858036.2858196

Kim, J., Kim, S., and Lee, J. (2022). The Effect of Multisensory Pseudo-haptic Feedback on Perception of Virtual Weight. IEEE Access 10, 5129–5140. doi:10.1109/ACCESS.2022.3140438

Kim, S., Lee, B., and Oulasvirta, A. (2018). "Impact Activation Improves Rapid Button Pressing," in Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (New York, NY: Association for Computing Machinery), 1–8. doi:10.1145/3173574.3174145

Kim, S., and Lee, G. (2013). "Haptic Feedback Design for a Virtual Button along Force-Displacement Curves," in Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology (New York, NY: Association for Computing Machinery), 91–96. doi:10.1145/2501988.2502041

Kim, S., Son, J., Lee, G., Kim, H., and Lee, W. (2013). TapBoard: Making a Touch Screen Keyboard More Touchable. New York, NY: Association for Computing Machinery, 553–562. doi:10.1145/2470654.2470733

Kohli, L. (2013). Redirected Touching. Ph.D. thesis.

Kuroki, S., Kajimoto, H., Nii, H., Kawakami, N., and Tachi, S. (2007). "Proposal for Tactile Sense Presentation that Combines Electrical and Mechanical Stimulus," in Second Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems (WHC'07) (IEEE), 121–126. doi:10.1109/whc.2007.92

Lecuyer, A., Coquillart, S., Kheddar, A., Richard, P., and Coiffet, P. (2000). "Pseudo-haptic Feedback: Can Isometric Input Devices Simulate Force Feedback?," in Proceedings IEEE Virtual Reality 2000 (Cat. No.00CB37048) (IEEE), 83–90.

Lécuyer, A. (2009). Simulating Haptic Feedback Using Vision: A Survey of Research and Applications of Pseudo-haptic Feedback. Presence Teleoperators Virtual Environ. 18, 39–53. doi:10.1162/pres.18.1.39

Lee, B., Deng, Q., Hoggan, E., and Oulasvirta, A. (2017). "Boxer: A Multimodal Collision Technique for Virtual Objects," in Proceedings of the 19th ACM International Conference on Multimodal Interaction (New York, NY: Association for Computing Machinery), 252–260. doi:10.1145/3136755.3136761

Lee, Y., Jang, I., and Lee, D. (2015). "Enlarging Just Noticeable Differences of Visual-Proprioceptive Conflict in VR Using Haptic Feedback," in 2015 IEEE World Haptics Conference (WHC) (IEEE), 19–24. doi:10.1109/whc.2015.7177685

Liao, Y.-C., Kim, S., Lee, B., and Oulasvirta, A. (2020). "Button Simulation and Design via FDVV Models," in Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (New York, NY: Association for Computing Machinery), 1–14. doi:10.1145/3313831.3376262

Liao, Y.-C., Kim, S., and Oulasvirta, A. (2018). "One Button to Rule Them All," in The 31st Annual ACM Symposium on User Interface Software and Technology Adjunct Proceedings (New York, NY: Association for Computing Machinery), 111–113. doi:10.1145/3266037.3266118

Lo, J.-Y., Huang, D.-Y., Sun, C.-K., Hou, C.-E., and Chen, B.-Y. (2018). RollingStone: Using Single Slip Taxel for Enhancing Active Finger Exploration with a Virtual Reality Controller. New York, NY: Association for Computing Machinery, 839–851. doi:10.1145/3242587.3242627

Lopes, P., and Baudisch, P. (2013). "Muscle-propelled Force Feedback," in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (New York, NY: Association for Computing Machinery), 2577–2580. doi:10.1145/2470654.2481355

Lopes, P., Ion, A., and Baudisch, P. (2015). "Impacto," in Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology (New York, NY: Association for Computing Machinery), 11–19. doi:10.1145/2807442.2807443

Lopes, P., You, S., Cheng, L.-P., Marwecki, S., and Baudisch, P. (2017). "Providing Haptics to Walls & Heavy Objects in Virtual Reality by Means of Electrical Muscle Stimulation," in Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (New York, NY: Association for Computing Machinery), 1471–1482. doi:10.1145/3025453.3025600

Lubos, P., Bruder, G., Ariza, O., and Steinicke, F. (2016). "Touching the Sphere," in Proceedings of the 2016 Symposium on Spatial User Interaction (New York, NY: Association for Computing Machinery), 13–22. doi:10.1145/2983310.2985753

Maereg, A. T., Nagar, A., Reid, D., and Secco, E. L. (2017). Wearable Vibrotactile Haptic Device for Stiffness Discrimination during Virtual Interactions. Front. Robot. AI 4, 42. doi:10.3389/frobt.2017.00042

Marchal, M., Gallagher, G., Lécuyer, A., and Pacchierotti, C. (2020). "Can Stiffness Sensations Be Rendered in Virtual Reality Using Mid-air Ultrasound Haptic Technologies?," in Haptics: Science, Technology, Applications. Editors I. Nisky, J. Hartcher-O'Brien, M. Wiertlewski, and J. Smeets (Springer International Publishing), 297–306. doi:10.1007/978-3-030-58147-3_33

Metzger, A., and Drewing, K. (2015). "Haptically Perceived Softness of Deformable Stimuli Can Be Manipulated by Applying External Forces during the Exploration," in 2015 IEEE World Haptics Conference (WHC) (IEEE), 75–81. doi:10.1109/whc.2015.7177694

Miyamoto, N., Aoyama, K., Furukawa, M., Maeda, T., and Ando, H. (2015). Air Tap: The Sensation of Tapping a Rigid Object in Mid-air. Tokyo: Springer Japan, 285–291. doi:10.1007/978-4-431-55690-9_52

Napier, J. R., and Tuttle, R. H. (1993). Hands. Princeton University Press.

Nilsson, N. C., Zenner, A., Simeone, A. L., Johnsen, K., Sandor, C., and Billinghurst, M. (2021). Propping up Virtual Reality with Haptic Proxies. IEEE Comput. Graph. Appl. 41, 104–112. doi:10.1109/MCG.2021.3097671

Oulasvirta, A., Kim, S., and Lee, B. (2018). "Neuromechanics of a Button Press," in Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (New York, NY: Association for Computing Machinery), 1–13. doi:10.1145/3173574.3174082

Parise, C. V. (2016). Crossmodal Correspondences: Standing Issues and Experimental Guidelines. Multisens. Res. 29, 7–28. doi:10.1163/22134808-00002502

Pezent, E., Israr, A., Samad, M., Robinson, S., Agarwal, P., Benko, H., et al. (2019). "Tasbi: Multisensory Squeeze and Vibrotactile Wrist Haptics for Augmented and Virtual Reality," in 2019 IEEE World Haptics Conference (WHC) (IEEE), 1–6. doi:10.1109/whc.2019.8816098

Pezent, E., O'Malley, M. K., Israr, A., Samad, M., Robinson, S., Agarwal, P., et al. (2020). "Explorations of Wrist Haptic Feedback for AR/VR Interactions with Tasbi," in Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems (New York, NY: Association for Computing Machinery), 1–4. doi:10.1145/3334480.3383151

Pfeiffer, M., Schneegass, S., Alt, F., and Rohs, M. (2014). "Let Me Grab This," in Proceedings of the 5th Augmented Human International Conference (New York, NY: Association for Computing Machinery). doi:10.1145/2582051.2582099

Reed, C. L., and Ziat, M. (2018). Haptic Perception: From the Skin to the Brain. Elsevier. doi:10.1016/B978-0-12-809324-5.03182-5

Rietzler, M., Geiselhart, F., Gugenheimer, J., and Rukzio, E. (2018). "Breaking the Tracking," in Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (New York, NY: Association for Computing Machinery), 1–12. doi:10.1145/3173574.3173702

Rock, I., and Victor, J. (1964). Vision and Touch: An Experimentally Created Conflict between the Two Senses. Science 143, 594–596. doi:10.1126/science.143.3606.594

Samad, M., Gatti, E., Hermes, A., Benko, H., and Parise, C. (2019). "Pseudo-Haptic Weight," in Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (New York, NY: Association for Computing Machinery), 1–13. doi:10.1145/3290605.3300550

Schneider, D., Otte, A., Gesslein, T., Gagel, P., Kuth, B., Damlakhi, M. S., et al. (2019). Reconviguration: Reconfiguring Physical Keyboards in Virtual Reality. IEEE Trans. Vis. Comput. Graph. 25, 3190–3201. doi:10.1109/tvcg.2019.2932239

Schorr, S. B., and Okamura, A. M. (2017). "Fingertip Tactile Devices for Virtual Object Manipulation and Exploration," in Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (New York, NY: Association for Computing Machinery), 3115–3119. doi:10.1145/3025453.3025744

Sinclair, M., Ofek, E., Gonzalez-Franco, M., and Holz, C. (2019). "CapstanCrunch," in Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (New York, NY: Association for Computing Machinery), 815–829. doi:10.1145/3332165.3347891

Steinicke, F., Bruder, G., Jerald, J., Frenz, H., and Lappe, M. (2010). Estimation of Detection Thresholds for Redirected Walking Techniques. IEEE Trans. Vis. Comput. Graph. 16, 17–27. doi:10.1109/TVCG.2009.62

Strasnick, E., Holz, C., Ofek, E., Sinclair, M., and Benko, H. (2018). "Haptic Links," in Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (New York, NY: Association for Computing Machinery), 1–12. doi:10.1145/3173574.3174218

Sun, Y., Yoshida, S., Narumi, T., and Hirose, M. (2019). PaCaPa: A Handheld VR Device for Rendering Size, Shape, and Stiffness of Virtual Objects in Tool-Based Interactions. New York, NY: Association for Computing Machinery, 1–12. doi:10.1145/3290605.3300682

Sutherland, I. E. (1965). The Ultimate Display. Proc. IFIP Congr. 2, 506–509.

Takahashi, A., Brooks, J., Kajimoto, H., and Lopes, P. (2021). "Increasing Electrical Muscle Stimulation's Dexterity by Means of Back of the Hand Actuation," in Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (New York, NY: Association for Computing Machinery). doi:10.1145/3411764.3445761

Takahashi, A., Tanabe, K., and Kajimoto, H. (2019). Haptic Interface Using Tendon Electrical Stimulation with Consideration of Multimodal Presentation. Virtual Real. Intelligent Hardw. 1, 163–175. doi:10.3724/sp.j.2096-5796.2019.0011

Takahashi, A., Tanabe, K., and Kajimoto, H. (2018). "Relationship between Force Sensation and Stimulation Parameters in Tendon Electrical Stimulation," in Haptic Interaction. Editors S. Hasegawa, M. Konyo, K.-U. Kyung, T. Nojima, and H. Kajimoto (Springer Singapore), 233–238. doi:10.1007/978-981-10-4157-0_40

Tamaki, E., Miyaki, T., and Rekimoto, J. (2011). "PossessedHand," in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (New York, NY: Association for Computing Machinery), 543–552. doi:10.1145/1978942.1979018

Verschoor, M., Casas, D., and Otaduy, M. A. (2020). Tactile Rendering Based on Skin Stress Optimization. ACM Trans. Graph. (Proc. ACM SIGGRAPH) 39, 1. doi:10.1145/3386569.3392398

Whitmire, E., Benko, H., Holz, C., Ofek, E., and Sinclair, M. (2018). "Haptic Revolver: Touch, Shear, Texture, and Shape Rendering on a Reconfigurable Virtual Reality Controller," in Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (New York, NY: Association for Computing Machinery), 1–12. doi:10.1145/3173574.3173660

Yang, J., Horii, H., Thayer, A., and Ballagas, R. (2018). "VR Grabbers," in Proceedings of the 31st Annual ACM Symposium on User Interface Software and Technology (New York, NY: Association for Computing Machinery), 889–899. doi:10.1145/3242587.3242643

Yem, V., Vu, K., Kon, Y., and Kajimoto, H. (2018). "Effect of Electrical Stimulation Haptic Feedback on Perceptions of Softness-Hardness and Stickiness while Touching a Virtual Object," in 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) (Reutlingen, Germany: IEEE), 89–96. doi:10.1109/vr.2018.8446403

Yu, R., and Bowman, D. A. (2020). Pseudo-haptic Display of Mass and Mass Distribution during Object Rotation in Virtual Reality. IEEE Trans. Vis. Comput. Graph. 26, 2094–2103. doi:10.1109/tvcg.2020.2973056

Yuan, Y., and Steed, A. (2010). "Is the Rubber Hand Illusion Induced by Immersive Virtual Reality?," in Proceedings of the 2010 IEEE Virtual Reality Conference (IEEE Computer Society), 95–102. doi:10.1109/VR.2010.5444807

Zenner, A., and Krüger, A. (2019). "Estimating Detection Thresholds for Desktop-Scale Hand Redirection in Virtual Reality," in 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) (Osaka, Japan: IEEE), 47–55. doi:10.1109/vr.2019.8798143

Zilles, C. B., and Salisbury, J. K. (1995). "A Constraint-Based God-Object Method for Haptic Display," in Proceedings 1995 IEEE/RSJ International Conference on Intelligent Robots and Systems. Human Robot Interaction and Cooperative Robots (Pittsburgh, PA: IEEE), 146–151.

Keywords: haptics, proprioceptive feedback, stiffness, virtual reality, tactile feedback, pseudo-haptics

Citation: Ariza Nunez OJ, Zenner A, Steinicke F, Daiber F and Krüger A (2022) Holitouch: Conveying Holistic Touch Illusions by Combining Pseudo-Haptics With Tactile and Proprioceptive Feedback During Virtual Interaction With 3DUIs. Front. Virtual Real. 3:879845. doi: 10.3389/frvir.2022.879845

Received: 20 February 2022; Accepted: 11 April 2022;
Published: 08 June 2022.

Edited by:

Kazuma Aoyama, The University of Tokyo, Japan

Reviewed by:

Yuki Ban, The University of Tokyo, Japan
Takuto Nakamura, Tokyo Institute of Technology, Japan

Copyright © 2022 Ariza Nunez, Zenner, Steinicke, Daiber and Krüger. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Oscar Javier Ariza Nunez, oscar.javier.ariza.nunez@uni-hamburg.de; André Zenner, andre.zenner@dfki.de
