- 1 Faculty of Engineering and Science, Department of Information and Communication Technology, University of Agder, Grimstad, Norway
- 2 Faculty of Engineering and Science, Department of Engineering Sciences, University of Agder, Grimstad, Norway
Virtual reality (VR) technology is a promising tool in physical rehabilitation. Research indicates that VR-supported rehabilitation is beneficial for task-specific training, multi-sensory feedback, diversified rehabilitation tasks, and patient motivation. Our first goal was to create a biomechatronics laboratory with a VR setup for increasing immersion and a motion platform to provide realistic feedback to patients. The second goal was to investigate possibilities for replicating features of the biomechatronics laboratory in a home-based training system using commercially available components. The laboratory comprises a motion platform with six degrees of freedom (Rexroth eMotion), fitted with a load-cell-integrated treadmill, and an Oculus Quest virtual reality headset. The load cells provide input for data collection as well as VR motion control. The home-based rehabilitation system consists of a Nintendo Wii Balance Board and an Oculus Rift virtual reality headset. User studies in the laboratory and home environment used direct observation techniques and self-reported attitudinal research methods to assess the solution’s usability and user experience. The findings indicate that the proposed VR solution is feasible. Participants using the home-based system experienced more cybersickness and imbalance compared to those using the biomechatronics laboratory solution. Future studies will focus on a setup that is safe for first patient studies, and on exercises to improve patient diagnosis and progress monitoring during rehabilitation.
1 Introduction
Virtual reality (VR) is a promising tool in the rehabilitation of neurological conditions, such as stroke, Parkinson’s disease and traumatic brain injury (Cano Porras et al., 2018). Patients with neurological conditions often require rehabilitation in the early stages, and some require rehabilitation regularly throughout their lives. Problems with balance and gait are among the challenges that often limit these patients in everyday life (Darekar et al., 2015). Generally, patients undergo rehabilitation at a facility and follow self-guided rehabilitation programs at home. In the last decade, research has focused on increasing motivation by creating multi-sensory VR rehabilitation programs (Cano Porras et al., 2019), because research has shown that rehabilitation programs, especially self-guided ones, can be tedious and demotivating (Howard, 2017; Koenig et al., 2019). The inclusion of tools such as treadmills, motion platforms, and sensors aims to create more enjoyable, effective, and task-specific rehabilitation programs (Shema et al., 2014; Kern et al., 2019).
A collaboration between teams with experience in multimedia technology and mechatronics engineering developed a prototype of a biomechatronics laboratory at the University of Agder. The biomechatronics lab consists of a 6 degrees of freedom (6DOF) Rexroth motion platform, a treadmill integrated with sensors and a virtual reality headset. A home-based system was developed as a prototype for exploring the possibility of using the system at home (Madshaven and Markseth, 2020). We implemented the system through a human-centred design process including stakeholder involvement and user-based testing. The prototypes were developed as a proof-of-concept as a starting point for future development. This research was conducted on healthy adults. The long-term motivation for the research is aimed at rehabilitation of balance and gait problems in patients with cognitive impairment.
The paper is organized as follows: a review of research on improving rehabilitation using VR is given in Section 2. In Section 3, the realization of a biomechatronics laboratory platform is described. In Section 4, a simplified home version of the platform is presented. User tests and preliminary results are showcased in Section 5. Finally, proposed future work and planned activities are discussed in Section 6.
2 Technology Supported Rehabilitation
2.1 Rehabilitation and Virtual Reality
Physical rehabilitation helps people who have difficulties with gait, balance and mobility towards living a normal life. Therapists create an individual plan for each patient that may include massages, balance and gait training, pain management, etc.1 A program can focus on various exercises, such as stability and strength, balance, and gait, with the number of repetitions depending on the patient’s capacity, starting with a small number of repetitions and slowly and steadily increasing it (Stożek et al., 2016). Many patients need treatment in a medical facility and a long-term self-guided rehabilitation program (Lawo and Knackfuß, 2018). Patients often struggle to stay motivated when having to exercise regularly on their own (Koenig et al., 2019), resulting in low adherence to self-guided programs (Proffitt and Lange, 2015; Kleim et al., 2019).
VR has the potential to create specific task-related training scenarios for situations that could be impossible or even dangerous for patients to perform in real life (Weiss et al., 2003; Holden, 2005; Brütsch et al., 2010; Kalron et al., 2016). VR has been used in the rehabilitation of several medical problems, and VR design varies from game-like activities (Gil-Gómez et al., 2011; Seo et al., 2016) to task-specific everyday activities such as kitchen work (Koenig et al., 2019), grocery shopping (Kizony et al., 2010) or simply crossing a street (Weiss et al., 2003). These task-specific exercises are especially beneficial for patients with cognitive impairments (Lawo and Knackfuß, 2018). With VR, patients interact with a computer-generated world while doing rehabilitation exercises. Virtual reality rehabilitation (VRR) has been shown to improve problems with balance, gait, motor control and strength (Howard, 2017), even more so when combined with traditional rehabilitation (Cano Porras et al., 2018). Utilizing VR in rehabilitation has been shown to benefit both patients and therapists. While the patient trains in a safe environment, the therapist evaluates each environment’s safety, controls what stimuli the patient receives during an exercise, and chooses what the patient can interact with in the virtual environment, such as other people or objects (Schiza et al., 2019; Lubetzky et al., 2020).
According to Howard (2017) and Cano Porras et al. (2018), using VR in rehabilitation increases motivation, the feeling of enjoyment, and adherence to the program. VR also offers individual adaptation (Koenig et al., 2019), since it provides a vast range of design possibilities. With individual adaptation, the exercises and environment can be designed to fit each patient’s needs and capacity. Studies show that conventional rehabilitation exercises can be transferred to VR (Proffitt and Lange, 2015; Kern et al., 2019). It also opens up the possibility of home-based rehabilitation programs. What motivates and interests a person varies between individuals. By using the benefits of VR, rehabilitation programs can be individualized to suit each patient. For example, a person post-stroke may be interested in fishing but, due to health challenges, cannot go fishing until enough mobility in the arms is regained. With VR, it is possible to create a fishing game that would be fun, motivating and supportive of such a patient’s rehabilitation. Fun and immersive VR exercises could appeal to each individual better than the repetitive execution of abstract movements without the feedback and enjoyment that VR can potentially offer (Keshner and Fung, 2017).
VR has the potential to give multi-sensory feedback (Cano Porras et al., 2019), such as sight, hearing, and movement, to make the experience more immersive, engaging, and motivating for the patients (Kourtesis et al., 2019). Table 1 presents the different senses along with the feedback and the tools that display or create the feedback. The two primary senses are used to monitor the external environment, while proprioception and the vestibular sense are related to the body’s awareness of position, movement of limbs and muscles, and balance (Tuthill and Azim, 2018). To facilitate changes in proprioceptive cues, different surfaces, some of which are mentioned in Section 2.2, could be used.
2.2 Motion Platforms and Sensors in Rehabilitation
Tools such as camera-based systems, treadmills, force plates, and motion platforms can enhance rehabilitation exercises. For example, the CAREN2 (Computer Assisted Rehabilitation Environment) system is a biomechanics lab, consisting of a treadmill mounted on a 6 degrees of freedom motion platform and a dome that projects a virtual environment. The lab, created by Motek Medical,3 adds the benefit of ground movement, such as when walking down a slope or standing on a ship at sea. CAREN has proven to be an effective method in balance training (Kalron et al., 2016) for rehabilitation of people with multiple sclerosis, as well as in rehabilitation of gait and balance problems (Sessoms et al., 2015; Cano Porras et al., 2019).
A wide range of sensors is beneficial in a rehabilitation process, such as electromyography for measuring muscle activity, and inertial measurement units for measuring the gait cycle and postural balance (Dobkin, 2013). These sensors are used to measure progress and adjust the rehabilitation process to fit the patient’s level, thus improving their user experience. Furthermore, data collected by these sensors can be visually presented to the patient in VR to enhance the user experience.
2.3 Combining Motion Sensors and Virtual Reality Solutions in Rehabilitation
One of the most reported VR strengths is ecological validity: the degree of similarity of the training to the real world and the potential for the learned skills to transfer to the patients’ everyday lives (Koenig et al., 2019). The authors use the example of making breakfast in a virtual kitchen. A patient prepares food, looks for utensils, makes coffee and toast, listens to the weather forecast, and gets interrupted while doing these tasks by taking phone calls. All of these tasks can be accomplished in VR, both visually and kinesthetically. The kinematic data gathered from the motion sensors, the behavioral data, and log files contribute to a large data set that can be analyzed to track the patient’s progress. Koenig et al. (2019) also highlighted the benefit of adding social interaction. Social interactions can enrich the experience and make it more related to the patient’s everyday life. Some aspects of social interaction involve distractions and disturbances, and patients will need to be prepared for situations like these. Kern et al. (2019) created a VRR program intended to motivate patients to walk for an extended period using a treadmill and a VR headset. They created an engaging story with characters where the goal was to rebuild the home of a dog companion through walking. As the patient walks, the world rebuilds, and visual and auditory cues and small animations enrich the game. The study concluded that gamification elements (i.e., an appealing storyline, rewards, and social interaction) increased motivation in gait rehabilitation. The study also reported a higher level of well-being, in terms of user satisfaction, anxiety, and simulator sickness, than in participants in non-VR conditions.
In a study by Proffitt and Lange (2015), a home-based rehabilitation program was created using a Microsoft Kinect, monitor and PC. The system was tested on four participants, where three out of four found the program usable. The authors concluded that one potential issue with using VR in self-guided rehabilitation programs is that a certain level of technical knowledge is needed to operate the system. Nonetheless, using such technology could be useful in self-guided programs, as the technology seems to increase the amount of fun and motivation. It can also gather data on patient progress, facilitating therapists to tailor rehabilitation programs to specific individual needs.
2.4 Virtual Reality and Neuroplasticity
Neuroplasticity is the brain’s ability to change throughout our lives as a result of experience. Neuroplasticity helps us learn new skills, enhance existing capabilities, and aids recovery after loss of functionality, e.g., loss of speech, motor control and balance caused by cognitive impairments. When the brain is damaged by illness or accident, it has the ability to change and reform neural connections. The brain can reorganize its structure according to the environment (Lawo and Knackfuß, 2018). In rehabilitation of impairments related to the brain, such as stroke or Parkinson’s disease, therapists try to induce neuroplasticity. For example, a patient suffering from loss of function on one side of the body after a stroke may, with intensive rehabilitation, rearrange or create new neural pathways to replace those that were lost. Rearranging these neural connections only happens with the right stimulus and sustained motivation (Lawo and Knackfuß, 2018). Stronger connections in the brain are made by doing the tasks regularly. Rehabilitation starts in the hospital or in a clinic, and usually continues at home after discharge. Lawo and Knackfuß (2018) suggest utilizing easy-to-use technology for patients to engage in self-guided programs. In their article, they present a set of features and functionalities required for an ideal home solution, one of which is exercises within a personalized, serious-game-based rehabilitation program in a virtual environment. In Table 2, Kleim and Jones’s (2008) principles of experience-dependent plasticity are coupled with VR possibilities, suggested by the authors, which may facilitate neuroplasticity.
TABLE 2. Based on Kleim and Jones (2008): Principles of experience-dependent plasticity further developed to include VR possibilities.
2.5 Cybersickness
The feeling of sitting in a stationary car while the car beside it starts moving is often a little confusing, nauseating and discomforting. That feeling can also occur while using VR. There are many terms for this phenomenon: some call it cybersickness, others motion sickness or simulator sickness (LaViola, 2000; Davis et al., 2014). These terms are similar, but they have different triggers and somewhat different symptoms. Motion sickness can be triggered by rollercoasters, driving cars and other real-world experiences, whereas cybersickness (discussed in this work) is triggered by VR. Discomfort, nausea, sweating, fatigue and headache are some of the symptoms of cybersickness (LaViola, 2000). To the best of our knowledge, it is still challenging to define specific characteristics of people whose cybersickness can be triggered by VR. There are three main theories as to why someone gets cybersickness: 1) the Sensory Conflict Theory, 2) the Poison Theory and 3) the Postural Instability Theory (LaViola, 2000). The Sensory Conflict Theory is based on a conflict between the vestibular sense and the visual sense, which could trigger motion sickness. This conflict is not unique to VR and can be triggered in everyday life, for example when movement is seen but not felt.
Kennedy et al. (1993) created the Simulator Sickness Questionnaire (SSQ), which is the most commonly used questionnaire for investigating motion sickness and can also be used for cybersickness. The SSQ contains 16 symptoms, each scored from zero (no symptoms) to three (severe symptoms). It is administered twice: once before testing to establish the participant’s baseline state, and again right after testing to see whether the participant experienced any symptoms.
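To illustrate how such responses can be handled in practice, the sketch below tallies pre- and post-test ratings in the way they are used later in this paper (counting how many symptoms reach a mild score or worse). It is a minimal, hypothetical Python example, not the analysis script used in this study; the symptom names and values are placeholders.

```python
# Hypothetical sketch of tallying SSQ responses (not the study's analysis
# script): each of the 16 symptoms is rated 0 (none) to 3 (severe) before
# and after a session; here we compute per-symptom changes and count how
# many symptoms reach a mild score or worse after the session.

def ssq_change(pre, post):
    """Per-symptom change from the pre-test to the post-test SSQ."""
    return {symptom: post[symptom] - pre[symptom] for symptom in pre}

def count_symptoms(post, threshold=1):
    """Number of symptoms reported at or above the given severity."""
    return sum(1 for score in post.values() if score >= threshold)

# Example with two of the 16 symptoms (placeholder values):
pre = {"nausea": 0, "eyestrain": 0}
post = {"nausea": 1, "eyestrain": 2}
print(ssq_change(pre, post))    # {'nausea': 1, 'eyestrain': 2}
print(count_symptoms(post))     # 2
```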
Different measures can be taken to prevent, or at least reduce, the risk of triggering cybersickness when using VR. Studies have shown that pleasant music and aromas decrease the risk of cybersickness (Keshavarz and Hecht, 2014; Keshavarz et al., 2015). Research also indicates that having a virtual nose could alleviate cybersickness (Wienrich et al., 2018). Providing a multisensory VR environment, for example by including movement, could also reduce the mismatch between the senses. In their literature review, Chang et al. (2020) proposed a multimodal fidelity hypothesis, which suggests that minimizing the mismatch between sensory information could help alleviate cybersickness. In the study by Plouzeau et al. (2015), the level of cybersickness lessened when the participants experienced vibrations corresponding to the visual stimulus in VR. Hardware is another critical factor in preventing cybersickness; aspects that need to be considered are the refresh rate, field of view (FOV), screen size, and resolution. Kourtesis et al. (2019) reviewed different types of technology used in VRR and derived a list of suggested technical standards, which is presented in Table 3.
The severity of cybersickness varies among users. Factors such as age, gender, emotional and psychological state, health and previous experience with VR can all affect susceptibility to cybersickness (LaViola, 2000; Davis et al., 2014). Some people naturally have a higher degree of cybersickness susceptibility, which might be attributed to the difference in depth perception (Benzeroual and Allison, 2013).
3 A Biomechatronics Lab for Rehabilitation Studies
Haptic rendering, associated with virtual/augmented reality, is an emerging topic within the fields of gaming (Tokuyama et al., 2019), robot control (Liu et al., 2019), training/rehabilitation (Bortone et al., 2018), and human-machine interaction (Sanfilippo et al., 2015; Sanfilippo and Pacchierotti, 2018; Sanfilippo and Pacchierotti, 2020). Haptic rendering is the process of computing and generating forces in response to user interactions with virtual objects (Salisbury et al., 1995). Concerning physiotherapy, haptic rendering has shown indications of enhancing learning progression and user experience in people with reduced motor skills due to stroke (Yeh et al., 2017; Pareek et al., 2018). That said, haptic rendering is still relatively new in the context of physiotherapy and needs further research to be properly exploited.
The lack of the sensation of physical interaction is the main factor differentiating a virtual world from the real world, and the gap between those worlds limits the use of artificial environments in areas other than gaming. It is therefore of interest to couple the two worlds by reproducing virtual, dynamic interactions mechanically, thus reducing the gap between the artificial world and the real world. Since the aim is to recreate forces that mimic interactions within the virtual world, haptic rendering can be thought of as a gateway between the virtual world and the physical world. The physiotherapeutic benefits of immersing a person in a virtual world with the aid of haptic rendering include, among others, physical engagement of patients, keeping the training on a stationary rig for optimal monitoring and assessment, the possibility to create controlled, user-specific exercises and difficulty levels, and the inclusion of objectives and rewards in the training routines. However, including haptic rendering in physiotherapy also poses challenges. The physical structure of the mechanical setup can feel intimidating, depending on the size and complexity of the machine. Moreover, people in need of rehabilitation often have mobility problems, which strongly affects the physical design and operation criteria of the machine. There is also always a safety concern in situations where a machine collaborates directly with a human, especially when the purpose of the machine is to exert a force on the human. A robot should not be able to harm any human, which means many considerations must be made when designing and developing a haptic-rendering-based rehabilitation system for physiotherapists and home use. These considerations, along with ethical issues and cost-over-function trade-offs, all contribute to limiting its current adoption as a tool used by health experts. This limited usage affects research on the topic and impairs advancement within the field. To contribute to the study of haptic rendering for rehabilitation and training, a proof-of-concept haptic rendering system was developed to enable lower limb assessment, training and rehabilitation in cooperation with an interactive, virtual world (Jomås and Lien, 2020).
3.1 Mechanical Setup
To model the physical rendering of a dynamic environment, we used a 6 DOF Stewart platform (Dasgupta and Mruthyunjaya, 2000) at the Norwegian Motion Laboratory at the University of Agder (UiA). The Stewart platform is a Rexroth eMotion-1500, which is specifically designed for simulation purposes.4 A treadmill is mounted on top of the motion platform to study locomotion performance in patients with complications concerning lower limb movements. The treadmill is equipped with embedded load sensors to measure load location and magnitude. The selected sensors enable measurements similar to those that can be collected with clinical force plates. The possibility of performing such measurements is important for the assessment of biomechanical motion and balance (Mansfield and Inness, 2015), and it enables the patient to directly influence the motion of the platform. The treadmill is also equipped with an industrial drive inverter to attain full control over the treadmill belt’s velocity, which contributes to personalizing rehabilitation programs. Moreover, a state-of-the-art motion tracking system from Qualisys is mounted on the ceiling above the motion platform to provide real-time visual monitoring of the skeletal structure during dynamic exercises.5 The selected camera system enables researchers and practitioners to conduct diagnosis and real-time assessment of the rehabilitation process. A safety frame with railings and a harness encapsulates the treadmill; it is designed to support the patients during operation and to catch them as soon as possible in case of a fall, preventing them from falling onto the treadmill or off the rig. In addition, an emergency stop is easily accessible to both the practitioner and the patient. Strict safety assurance and measures, as well as emerging standards in this area, need to be considered given that the system involves human-machine collaboration. A visual description of the components comprising the biomechatronics lab can be seen in Figure 1.
3.2 Dynamic Model of Surface Interaction
To enhance the immersive feeling of virtual reality, a dynamic model of surface interaction is implemented in the platform’s control flow. The purpose of the dynamic model is to mimic the response of a surface when it is “physically” interacted with in the virtual world. The dynamic response of the model can be adjusted to represent specific balancing scenarios such as snowboarding, skateboarding and surfing, as well as walking scenarios such as walking on an air mattress, on soft grass or on concrete. Depending on the patient’s location and pressure magnitude on the treadmill, the model provides the Stewart platform with commands that make it simulate the interaction between the patient and the corresponding virtual surface. This setup enables the eligible practitioner/operator to fully control the platform’s motion or to include patient intervention by passing control signals through the model. This makes the setup a suitable environment for assessment and rehabilitation where specific and personalized movements can be generated. The control architecture and interaction model are developed in LabVIEW, a software environment that offers a graphical programming approach.6 The LabVIEW program is deployed onto a myRIO student embedded device, which supports both LabVIEW graphical code and C.7 Although LabVIEW is ideal for quick-paced proof-of-concept programming, the architecture quickly becomes indecipherable for larger programs containing multiple statements and loops. Therefore, more traditional text-based programming languages might be preferable at a later stage.
The haptic model utilizes input data from four load cells, one mounted in each corner of the treadmill, as well as the coordinates of the topography of the virtual surface at the feet of the virtual first-person character within the virtual world. Based on trigonometry, the location and magnitude of the weight applied by the person on the treadmill are estimated. The haptic model consists of four mass-spring-damper submodels interconnected by two rectangular surfaces within a three-dimensional space, as illustrated in Figure 2.
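As an illustration of this estimation step, the following sketch computes the total load and its location as a force-weighted average of the four corner positions (the centre of pressure). It is a simplified Python example under an assumed rectangular geometry, not the LabVIEW code running in the lab; the treadmill dimensions are placeholder values.

```python
# Simplified sketch (assumed geometry, not the lab's LabVIEW code) of
# estimating total load and its location from four corner-mounted load
# cells on a rectangular treadmill; the dimensions are placeholder values.

def load_location_and_magnitude(f_fl, f_fr, f_rl, f_rr, length=1.5, width=0.6):
    """f_* are corner forces (N): front-left, front-right, rear-left,
    rear-right. Returns (x, y, total) with the origin at the treadmill
    centre, x forward along the walking direction and y to the right."""
    total = f_fl + f_fr + f_rl + f_rr
    if total <= 0.0:
        return 0.0, 0.0, 0.0                   # nobody standing on the treadmill
    # Force-weighted average of the corner coordinates = centre of pressure.
    x = (length / 2.0) * ((f_fl + f_fr) - (f_rl + f_rr)) / total
    y = (width / 2.0) * ((f_fr + f_rr) - (f_fl + f_rl)) / total
    return x, y, total

# Example: more weight on the front-right corner shifts the estimate
# forward and to the right.
print(load_location_and_magnitude(150.0, 300.0, 100.0, 150.0))
```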
The haptic model, also referred to as the board model, is developed using a state-space architecture in LabVIEW. As shown in Figure 2, the location and magnitude of the weight applied by the user act on the upper surface, while the lower surface mimics the virtual topography. The coordinates of the upper surface (the board) are then transmitted to the Stewart platform for physical rendering of the virtual board’s motion. Each mass-spring-damper model can be individually modified, making it possible to render different surface properties depending on the desired scenario. For the test run, the tilt angle of the board model relative to absolute coordinates determined the virtual turn rate, much like riding a snowboard in powdery snow. The information flow between the different components within the system, including the data processing unit containing the haptic model, is shown in Figure 3. The data processing unit processes information from the sensors and the virtual world, couples them through the haptic model, and provides the user with a hapto-audio-visual experience by actuating the platform, generating sound and displaying the corresponding virtual scenario.
FIGURE 3. The information flow between the different components within the system and the controller containing the haptic model.
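The corner submodels described above can be sketched as plain mass-spring-dampers. The Python example below is an illustrative approximation of one corner of the LabVIEW state-space board model, integrated with a semi-implicit Euler step; the parameter values and time step are assumptions, not the deployed implementation.

```python
# Illustrative sketch of one corner submodel as a mass-spring-damper,
# m*z'' + c*z' + k*(z - z_surface) = -F_corner, integrated with a
# semi-implicit Euler step. Parameter values are assumptions; the actual
# board model is a LabVIEW state-space implementation.

class CornerModel:
    def __init__(self, m=5.0, k=2000.0, c=150.0):
        self.m, self.k, self.c = m, k, c
        self.z, self.z_dot = 0.0, 0.0          # corner height (m) and velocity (m/s)

    def step(self, corner_force, surface_height, dt=0.01):
        """Advance one time step given the load carried by this corner (N)
        and the height of the virtual surface beneath it (m)."""
        spring = self.k * (self.z - surface_height)
        damper = self.c * self.z_dot
        z_ddot = (-corner_force - spring - damper) / self.m
        self.z_dot += z_ddot * dt              # semi-implicit Euler update
        self.z += self.z_dot * dt
        return self.z

# The four corner heights are then combined into the board's pose (height,
# pitch, roll), which is sent to the Stewart platform and, as in the test
# run, whose tilt angle is mapped to the virtual turn rate.
```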
3.3 Virtual Reality Environment and Implementation
Unity is a professional real-time 3D development platform with a built-in IDE.8 In this project, it is used to generate and render a virtual world, and to provide the topography information necessary for the Stewart platform to mimic the surface pose in the real world. The Unity VR application and the LabVIEW program exchange information via a middleware gateway realized in Python. The topography information, along with the patient’s intervention through the haptic interface, is then handled by the controller to provide a set of coordinates for the Stewart platform’s pose. This is a continuous, real-time process that generates an immersive, high-fidelity experience of the virtual world.
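The sketch below indicates how such a gateway could relay data over UDP, forwarding topography packets from Unity towards the LabVIEW controller (a symmetric relay would carry pose commands back to Unity). The ports and the packet layout of six little-endian floats are assumptions for illustration; this is not the gateway used in the lab.

```python
# Minimal sketch of a one-way relay for the middleware gateway idea
# (not the actual gateway used in the lab): topography packets arriving
# from Unity are forwarded to the LabVIEW/myRIO controller; a symmetric
# relay would carry the commanded platform pose back to Unity.
# Ports and the packet layout (six little-endian floats) are assumptions.

import socket
import struct

UNITY_IN = ("0.0.0.0", 5005)           # port on which Unity sends topography data
CONTROLLER_ADDR = ("127.0.0.1", 5006)  # assumed LabVIEW/myRIO controller endpoint
PACKET = "<6f"                         # assumed layout: x, y, z, roll, pitch, yaw

def relay_once(rx: socket.socket, tx: socket.socket) -> None:
    """Receive one topography packet from Unity and forward it unchanged."""
    data, _ = rx.recvfrom(1024)
    if len(data) == struct.calcsize(PACKET):   # drop malformed packets
        tx.sendto(data, CONTROLLER_ADDR)

if __name__ == "__main__":
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(UNITY_IN)
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        relay_once(rx, tx)
```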
The VR environment consists of a small city with an intersection. Along the road, there are different obstacles such as speed bumps, slopes, rocks and other bumps that the patient has to overcome to collect all the stars scattered along the road. The gameplay is simple: collect all the stars and try to maintain your balance. To collect a star, the patient has to navigate the character directly into it, which triggers a distinct sound to indicate that the star has been captured. To test the patients’ balance, the possibility of cybersickness and the biomechatronics lab’s functionality, the stars are strategically placed along the road. For example, one of the stars is placed at a dead end so that the patient has to turn 180 degrees to go back and try another path, while other stars are placed on the slopes and bumps so that the patient has to navigate over them. Currently, there is no point calculation or time measurement in the scenario, but these aspects could be added in the future. The initial plan was that users would walk on the treadmill and lean to either side to turn. This could not be implemented because of restrictions that prevented access to the biomechatronics lab facilities. Instead, the biomechatronics lab operator manually sets a speed for the character in VR, and the user navigates and turns by leaning, similar to skateboarding. The VR scenario of skateboarding was discussed with therapists before development (Madshaven and Markseth, 2020), and consideration was given to related previous work involving patients (Liao et al., 2018). Table 4 displays the therapeutic aims and the corresponding game design choices. Part of the rehabilitation of stroke patients is focused on sensory stimuli and shifting weight from side to side on the legs. According to the therapists, patients at their facility sometimes use a Nintendo Wii Balance Board (WBB) with Wii Fit in weight-bearing and balance training. For now, such training is minimal and only for fun because, according to the therapists, the patients tend to focus more on winning the game than on performing the movements correctly. This may have a negative impact on the therapeutic progression and could set back the progress obtained through traditional rehabilitation. Designing rehabilitation games that force the patient to perform the right movement in order to “win” could make such systems more suitable. The scenario was developed to be used with both the biomechatronics lab and the home version.
Table 5 presents three different VR headsets that were available at our facilities at the time, along with their specifications. Although the HTC Vive fulfilled most of the suggested standards, the headset chosen for the biomechatronics lab was the Oculus Quest, because it offered the best technical solution: it has internal sensors and can be operated with a single USB-C cable. The motion platform moves up and down approximately 1 m during operation, making accurate motion tracking challenging with headsets that require external sensors. The Oculus Quest is a standalone VR headset; it is wireless and does not require a computer to function. The HTC Vive can also function wirelessly if a wireless adapter is purchased, but it still requires a computer and external sensors for tracking, making it less portable than the Quest.
TABLE 5. Comparison of features on Oculus Quest9, Oculus Rift10,11 and HTC Vive (Borrego et al., 2018).
3.4 System Design Evaluation
The current biomechatronics lab was developed as a proof-of-concept to test the feasibility and performance, in terms of fidelity and immersiveness, of combining VR with a physical motion platform. Even though the physical components used and their assembly resulted in an unnecessarily over-dimensioned rig with limitations concerning usability and dynamic response, the tests indicated promising results regarding the increase in fidelity and immersiveness. Once engaged in the virtual world augmented by a sensible physical response, the mind quickly adopted the artificial physics, and orientation felt increasingly natural and instinctive. The setup proved its value as a proof-of-concept platform for biomechanical motion assessment and training/rehabilitation, and it will be used for further research within this field. In the future, however, a more compact and responsive mechanical solution is desired, both to make initiating assessment/rehabilitation and operation easier and to enhance the motion rendering.
4 A Simplified Home Version for Portability
A prototype of a simplified home version of the biomechatronics lab was created to explore the possibility of a portable rehabilitation system. Many people in Norway live in areas without a rehabilitation facility, and patients sometimes have to travel far to receive professional help. Therapists also supplement the rehabilitation with self-guided programs at home. Furthermore, because of COVID-19, patients had to rely more on in-home self-guided programs. Even though a home version is limited in terms of movement, it presents an affordable and portable solution that could be used in self-guided rehabilitation programs.
The home version consists of a WBB, an Oculus Rift VR headset and a PC. The WBB is used for navigating the character in the VR environment: the user has to lean forward, backward or to the sides, in the same way as in the biomechatronics lab. Four pressure sensors located inside the WBB measure the user’s center of pressure (Clark et al., 2010). By shifting weight on the board, the WBB can be used as a game controller by converting the output data into keystrokes. An open-source program called WiiBalanceWalker12 was used to handle this. The program connects to the WBB through Bluetooth and generates key presses corresponding to the keys W, A, S and D or the arrow keys up, down, left and right.
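To illustrate the controller idea, the sketch below maps a centre-of-pressure offset from a calibrated neutral point to one of the four direction keys, with a dead zone so that small sways are ignored. It is a hypothetical Python example of the concept, not the WiiBalanceWalker source code (which is a separate C# project); the thresholds are placeholder values.

```python
# Hypothetical sketch of the controller idea (not the WiiBalanceWalker
# source, which is a separate C# project): map the centre-of-pressure
# offset from a calibrated neutral point to one of four direction keys,
# with a dead zone so that small postural sways are ignored.

def cop_to_key(x, y, neutral_x=0.0, neutral_y=0.0, dead_zone=0.03):
    """x, y: centre of pressure on the board (m), x to the right, y forward.
    Returns 'w', 'a', 's', 'd' or None; an input library can then emit the
    chosen key as an actual key press for the game to consume."""
    dx, dy = x - neutral_x, y - neutral_y
    if abs(dx) < dead_zone and abs(dy) < dead_zone:
        return None                       # inside the dead zone: no key held
    if abs(dy) >= abs(dx):                # the dominant lean direction wins
        return "w" if dy > 0 else "s"
    return "d" if dx > 0 else "a"

# Example: leaning forward and slightly to the right still maps to 'w'.
print(cop_to_key(0.02, 0.08))             # 'w'
```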
Figure 4 shows the portable home version. For this prototype, the plan was to use an Oculus Quest wirelessly connected to a WBB. However, it was not possible to connect the WBB directly to the Oculus Quest via Bluetooth. Hence, a personal computer (PC) was used to first connect the WBB and then connect a VR headset. At the time of user testing, the Quest was very limited in terms of which USB-C cables would work, so an Oculus Rift was used instead. Since then, an Oculus Quest update has allowed connection with most types of USB-C cables. The VR environment is the same as the one used in the biomechatronics lab.
5 User Testing
5.1 Aims of the User Tests
The observational testing of the biomechatronics lab aimed at getting initial feedback on the system’s functionality and user experience. The user testing of the home-based rehabilitation system aimed to assess its usability, evaluate the user experience, and determine whether the system could be used at home, outside of rehabilitation facilities.
5.2 Biomechatronics Lab
Due to COVID-19 and related university guidelines, the number of people and the time allowed in the Motion Lab were restricted. This meant that we could not conduct user testing with external participants in the biomechatronics lab, and our time slot for lab access had to accommodate both the final system integration and an observational test. We tested with five people, three of them affiliated with the project. All participants were men between the ages of 25 and 40, and three of them had minor previous experience with VR.
The methods used were direct observation in a controlled environment, followed by oral feedback. The participants voiced their thoughts about the user experience and the functionality of the system.
All participants were in the same room at all times during user testing. One by one, the participants stepped onto the biomechatronics lab and were secured to the safety frame with a harness. When they were ready and comfortable, they were asked to collect six stars around the VR environment. During the test, the participants were encouraged to continuously express their thoughts about the experience, following the think-aloud protocol.
The results from user testing of the biomechatronics lab show that the user experience was enjoyable and fun. None of the participants experienced any cybersickness. One participant said that he felt a little disoriented when looking down in the VR environment because he could not see any legs. One of the participants was so immersed that he forgot how high up the motion platform was. Combining the biomechatronics lab with the VR system proved successful, and the systems functioned together with no perceived delays.
5.3 Home-Based System
Five people participated in user testing, four men and one woman. The participants’ ages ranged from 30 to 50 years. Four of the participants reported that they had used VR before and had some experience with video games. All of the participants were technologically competent. Although this was a small sample size, a small number is often enough for assessment in usability studies (Lazar et al., 2017), which focus on finding potential usability problems. Two semi-structured interviews with closed-ended and open-ended questions were created to get information about the participants, what they thought about the VR solution, and to better understand the VR solution’s user experience and usability. The Simulator Sickness Questionnaire (SSQ) was administered to measure cybersickness symptoms. During the user test, notes were taken on how the participants were doing and what was said during testing.
Figure 5 shows the structure of the user tests. The test participants were brought into the test area and informed of the test procedure and their rights as participants. After the first interview and SSQ, the participants were instructed in using the equipment and had a test run. When the participants felt comfortable with the equipment and how to use it, they collected six stars along a test track in the VR environment. During the test, the participants’ activities were observed and the duration of the given user tasks was recorded. After the test, the participants completed the SSQ again and responded to post-interview questions.
The findings from the participants’ post-interviews indicate that they found the home-based system easy to learn and effective and efficient to use (in terms of technical usability). One participant remarked that catching the stars was easy but that the path was difficult to navigate. Two participants found the system slightly frustrating with regard to the sensitivity of the WBB. All but one participant felt safe when testing the system; the remaining participant remarked that he would feel safer with nearby support in case of imbalance. Three out of five participants experienced no discomfort or symptoms of cybersickness during testing. The results from the SSQ (Table 6 and Figure 6) revealed that two of the participants experienced mild symptoms of cybersickness: the two youngest participants, between the ages of 30 and 35, experienced mild dizziness, headache, eyestrain, and nausea. All the participants remarked that they had fun during testing.
FIGURE 6. Representation of the SSQ results from Table 6, showing how many participants experienced mild cybersickness symptoms (score of 1 or 2 on the SSQ).
5.4 Discussion
Given the small sample size in user testing, this work cannot generalize the results regarding the incidence of cybersickness while using the developed systems. The Sensory Conflict Theory might explain why the biomechatronics lab did not induce any symptoms: in the biomechatronics lab, the motion platform moves with the environment seen in the VR headset, which could reduce the risk of triggering cybersickness (Chang et al., 2020). One of the participants mentioned that adding legs to the character could improve the system and thereby the user experience, since looking down in VR and not seeing one’s legs can be somewhat disorienting. None of the participants involved in testing the home-based system mentioned this. The level of realism in the biomechatronics lab is higher than in the home-based system because of the added sense of movement. Providing more sensory information could make users expect a higher level of realism because of the increased immersion in the experience. In the biomechatronics lab, users can perform various rehabilitation tasks in a more realistic setting, with varying levels of difficulty. Given the possibilities to create varying challenges in the VR environments, it would be possible to make adjustments according to individual users’ responses, hence sustaining their engagement, motivation and sense of control. Users familiar with sports and training facilities are likely to find it natural to use the biomechatronics lab, increasing the quality of the overall experience and user satisfaction.
User testing results indicated that the home system might induce cybersickness in some individuals, and that the WBB should be more responsive. Furthermore, the WBB does not add the sense of movement, which might be the reason why the home system induced more symptoms of cybersickness than the biomechatronics lab. The WBB was calibrated for each individual, and moving one’s feet away from the calibrated area could contribute to the lack of sensitivity experienced by some of the participants. Potential patients may need to calibrate the WBB themselves for home-based rehabilitation, which may lead to low sensitivity and accuracy.
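One simple way to obtain such a per-user calibration, assuming the mapping idea sketched earlier for the home system, is to average the centre-of-pressure readings while the user stands still for a few seconds and use the result as the neutral reference. The snippet below is a hypothetical illustration, not the procedure used in the study.

```python
# Hypothetical per-user calibration sketch (not the study's procedure):
# average the centre-of-pressure samples recorded while standing still
# and use the result as the neutral reference for the direction mapping.

def calibrate_neutral(samples):
    """samples: list of (x, y) centre-of-pressure readings (m) collected
    while the user stands still; returns the averaged neutral point."""
    n = len(samples)
    neutral_x = sum(x for x, _ in samples) / n
    neutral_y = sum(y for _, y in samples) / n
    return neutral_x, neutral_y

# Example: a few seconds of samples centred slightly forward and to the left.
samples = [(-0.01, 0.02), (-0.012, 0.018), (-0.009, 0.022)]
print(calibrate_neutral(samples))   # approx. (-0.0103, 0.02)
```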
As Bonnechère et al. (2015) suggested, there is a high correlation between laboratory force plates and the WBB in balance assessment. Although the WBB presents a viable low-cost, portable alternative to laboratory force plates, our user tests suggest that some users may find it difficult to use as a game controller. Based on our results from testing the home-based system, we suggest using newer and improved balance board hardware.
6 Conclusion
Two virtual reality supported rehabilitation solutions were developed and user-tested. One was a biomechatronics lab consisting of a 6 degrees of freedom motion platform, a treadmill with embedded load sensors for load location and magnitude, a motion tracking system, and a virtual reality system for interaction in a virtual environment (Supplementary Video 1). The other solution was a home-based system consisting of a Nintendo Wii Balance Board with pressure sensors for balance assessment and navigation in 3D, combined with an Oculus Rift. Both solutions, developed as proof-of-concept prototypes, were tested with a small number of users using observations, the Simulator Sickness Questionnaire, and pre- and post-test interviews. The user test results indicate that the home-based system would benefit from a balance board with better sensitivity and that the system induced cybersickness in some test persons. User testing of the biomechatronics lab indicates that the embedded load sensors in the treadmill offered a better user experience because they were more sensitive and easier to operate. None of the participants in the user testing of the biomechatronics lab experienced cybersickness. Although the participant groups were similar but not identical, we hypothesize that the reason is that the biomechatronics lab reduces the mismatch between the visual and vestibular senses, as proposed in the Sensory Conflict Theory (LaViola, 2000). Overall, the findings indicate that the proposed solutions are technically feasible, but further testing with patients and rehabilitation experts is needed to determine whether this is transferable to future patients.
6.1 Proposed Future Work and Planned Activities
Developing a more affordable, simpler to use, compact, and overall more comprehensible version of the current proof-of-concept biomechatronics lab is crucial to enable local stakeholders within the health sector to obtain and use the technology at their facilities. Moreover, home-based haptic device alternatives, with just enough features to target specific user groups instead of an all-round rig, may have a substantial impact on the progression of in-home rehabilitation and training and should be investigated further. In the future, the biomechatronics lab will hopefully be used not only for rehabilitation of balance and gait problems in patients with cognitive impairment, but for physical therapy in general.
In future research, the systems should be tested on a larger sample of healthy test participants before testing on patients. Before patient testing, the systems must be further developed to ensure safe testing. For the biomechatronics lab, adding sensors such as EMG and ECG could improve rehabilitation and give therapists valuable information about the patient.
New and improved hardware in terms of VR headsets and balance boards should be considered to increase usability and possibly reduce cybersickness. Oculus has released a new version of the Quest, the Oculus Quest 2. The Oculus Quest 2 is lighter, which causes less strain on the neck and head, and it has an improved refresh rate of 90 Hz as well as improved resolution.13 Reconfiguring the Oculus Quest to function wirelessly in both the biomechatronics lab and the home-based system would remove the need for a PC, which would be more cost-effective and convenient. The WBB is considered outdated and could not connect via Bluetooth directly to the Quest. Improved balance board hardware was not easily available, but there are some alternatives, such as Sensamove’s Sensbalance MiniBoard,14 which could be tested. Recent developments in portable motion platform rigs show promise for a home-based system.15 A further step towards a home-based solution would require studying solutions for remote assistance and observation by rehabilitation experts.
The current VR scenario should be further developed, and additional scenarios should be created. By adding motivational elements, such as points, timing and more engaging storylines, the overall VR experience could be improved. Adding sounds, music and interactions would make the experience even more immersive. The new scenarios should be developed in collaboration with both therapists and patients.
Data Availability Statement
The datasets presented in this study can be found in online repositories. The names of the repository/repositories and accession number(s) can be found below: https://gitlab.com/juliemad91/vr-rehabilitation, https://uia.brage.unit.no/uia-xmlui/handle/11250/2682565, https://uia.brage.unit.no/uia-xmlui/handle/11250/2680815
Author Contributions
JM and TM developed the theory concerning VR and rehabilitation, created the VR experience for the rehabilitation systems, and designed and carried out the user testing under the supervision of FR and GI. MO and DJ created the theory for the biomechatronics lab at the Top Research Centre Mechatronics, University of Agder (UiA). DJ contributed to the assembly of the physical proof-of-concept rig located at the lab under the supervision of MO. The VR experience at the biomechatronics lab, comprising visual and haptic rendering, was a result of collaboration between JM, TM, and DJ. JM, TM, and DJ wrote the manuscript with the support of FR, GI, and FS. MO conceived the original idea. FS contributed to the design of the work. FS, FR, and GI were involved in drafting the work and revising it critically, and have approved the version of the manuscript to be published. The project was supervised by FR, GI, FS, and MO.
Funding
This work is supported by the Top Research Centre Mechatronics, University of Agder (UiA), Jon Lilletuns vei 9, 4879, Grimstad, Norway.
Conflict of Interest
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Acknowledgments
We would like to give a special thanks to Associate Professor Sondre Sanden Tørdal for the development and implementation of a Python-based communication gateway between the biomechatronics lab and Unity. Thanks to Bård Kjetil Lien for his role as co-developer of the biomechatronics lab and co-writer of the affiliating Master’s thesis. We thank the University of Agder for facilitating this research through access to their technology and facilities. Thank you to Jan Christian Bjerke Strandene who helped us in the Motion LAB during development and testing of the biomechatronics lab.
Supplementary Material
The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/frvir.2021.645042/full#supplementary-material
Supplementary video 1 | The Biomechatronics Lab at the University of Agder.
Footnotes
1Integris (2018). A Guide to Different Types of Rehabilitation Therapy, https://integrisok.com/resources/on-your-health/2018/may/a-guide-to-different-types-of-rehabilitationtherapy.
2https://www.motekmedical.com/solution/caren
4https://www.boschrexroth.com/en/xc/industries/machinery-applications-and-engineering/motion-simulation-technology/products-and-solutions/6dof-motion-platform/emotion-1500/index
5https://www.qualisys.com/cameras/5-6-7/
6https://www.ni.com/en-no/shop/labview.html
7https://www.ni.com/en-no/shop/hardware/products/myrio-student-embedded-device.html
9Martindale, J. (2020). Oculus Quest vs. Oculus Rift. Library Catalog, https://www.digitaltrends.com/virtual-reality/oculus-quest-vs-oculus-rift/.
10Binstock, A. (2015). Powering the Rift, https://www.oculus.com/blog/poweringthe-rift/.
11Orland, K. (2016). The Ars review: Oculus Rift expands PC gaming past the monitors edge, https://arstechnica.com/gaming/2016/03/the-ars-review-oculus-rift-expandspc-gaming-past-the-monitors-edge/.
12https://github.com/lshachar/WiiBalanceWalker/releases/tag/v0.4
13https://www.oculus.com/compare/
14https://www.sensamove.com/en/sensbalance-miniboard/
15https://www.gforcefactory.com/edge-6d
References
Benzeroual, K., and Allison, R. S. (2013). “Cyber (Motion) Sickness in Active Stereoscopic 3D Gaming,” in 2013 International Conference on 3D Imaging, Liege, Belgium, Dec 3–Dec 5, 2013 (IEEE), 1–7.
Bonnechère, B., Jansen, B., Omelina, L., Rooze, M., and Van Sint Jan, S. (2015). Interchangeability of the Wii Balance Board for Bipedal Balance Assessment. JMIR Rehabil. Assist. Technol. 2, e8. doi:10.2196/rehab.3832
Borrego, A., Latorre, J., Alcañiz, M., and Llorens, R. (2018). Comparison of Oculus Rift and HTC Vive: Feasibility for Virtual Reality-Based Exploration, Navigation, Exergaming, and Rehabilitation. Games Health J. 7, 151–156. doi:10.1089/g4h.2017.0114
Bortone, I., Leonardis, D., Mastronicola, N., Crecchi, A., Bonfiglio, L., Procopio, C., et al. (2018). Wearable Haptics and Immersive Virtual Reality Rehabilitation Training in Children with Neuromotor Impairments. IEEE Trans. Neural Syst. Rehabil. Eng. 26, 1469–1478. doi:10.1109/TNSRE.2018.2846814
Brütsch, K., Schuler, T., Koenig, A., Zimmerli, L., Mérillat, S., Lünenburger, L., et al. (2010). Influence of Virtual Reality Soccer Game on Walking Performance in Robotic Assisted Gait Training for Children. J. Neuroeng. Rehabil. 7, 15. doi:10.1186/1743-0003-7-15
Cano Porras, D., Sharon, H., Inzelberg, R., Ziv-Ner, Y., Zeilig, G., and Plotnik, M. (2019). Advanced Virtual Reality-Based Rehabilitation of Balance and Gait in Clinical Practice. Ther. Adv. Chronic Dis. 10, 1–16. doi:10.1177/2040622319868379
Cano Porras, D., Siemonsma, P., Inzelberg, R., Zeilig, G., and Plotnik, M. (2018). Advantages of Virtual Reality in the Rehabilitation of Balance and Gait: Systematic Review. Neurology 90, 1017–1025. doi:10.1212/WNL.0000000000005603
Chang, E., Kim, H. T., and Yoo, B. (2020). Virtual Reality Sickness: A Review of Causes and Measurements. Int. J. Hum. Comput. Interaction 36, 1658–1682. doi:10.1080/10447318.2020.1778351
Clark, R. A., Bryant, A. L., Pua, Y., McCrory, P., Bennell, K., and Hunt, M. (2010). Validity and Reliability of the Nintendo Wii Balance Board for Assessment of Standing Balance. Gait Posture 31, 307–310. doi:10.1016/j.gaitpost.2009.11.012
Darekar, A., McFadyen, B. J., Lamontagne, A., and Fung, J. (2015). Efficacy of Virtual Reality-Based Intervention on Balance and Mobility Disorders Post-stroke: A Scoping Review. J. Neuroeng. Rehabil. 12, 46. doi:10.1186/s12984-015-0035-3
Dasgupta, B., and Mruthyunjaya, T. S. (2000). The Stewart Platform Manipulator: A Review. Mechanism Machine Theor. 35, 15–40. doi:10.1016/S0094-114X(99)00006-3
Davis, S., Nesbitt, K., and Nalivaiko, E. (2014). “A Systematic Review of Cybersickness,” in Proceedings of the 2014 Conference on Interactive Entertainment - IE2014. Editors K. Blackmore, K. Nesbitt, and S. P. Smith (Newcastle, NSW, Australia: ACM Press), 1–9.
Dobkin, B. H. (2013). Wearable Motion Sensors to Continuously Measure Real-World Physical Activities. Curr. Opin. Neurol. 26, 602–608. doi:10.1097/WCO.0000000000000026
Gil-Gómez, J.-A., Lloréns, R., Alcañiz, M., and Colomer, C. (2011). Effectiveness of a Wii Balance Board-Based System (eBaViR) for Balance Rehabilitation: a Pilot Randomized Clinical Trial in Patients with Acquired Brain Injury. J. Neuroeng. Rehabil. 8, 30. doi:10.1186/1743-0003-8-30
Holden, M. K. (2005). Virtual Environments for Motor Rehabilitation: Review. CyberPsychol. Behav. 8, 187–211. doi:10.1089/cpb.2005.8.187
Howard, M. C. (2017). A Meta-Analysis and Systematic Literature Review of Virtual Reality Rehabilitation Programs. Comput. Hum. Behav. 70, 317–327. doi:10.1016/j.chb.2017.01.013
Jomås, D., and Lien, B. K. (2020). Computer-Based Environment for Lower Limb Neurorehabilitation and Assessment. Tech. Rep. Grimstad: University of Agder.
Kalron, A., Fonkatz, I., Frid, L., Baransi, H., and Achiron, A. (2016). The Effect of Balance Training on Postural Control in People with Multiple Sclerosis Using the CAREN Virtual Reality System: a Pilot Randomized Controlled Trial. J. Neuroeng. Rehabil. 13, 13. doi:10.1186/s12984-016-0124-y
Kennedy, R. S., Lane, N. E., Berbaum, K. S., and Lilienthal, M. G. (1993). Simulator Sickness Questionnaire: An Enhanced Method for Quantifying Simulator Sickness. Int. J. Aviation Psychol. 3, 203–220. doi:10.1207/s15327108ijap0303_3
Kern, F., Winter, C., Gall, D., Kathner, I., Pauli, P., and Latoschik, M. E. (2019). “Immersive Virtual Reality and Gamification Within Procedurally Generated Environments to Increase Motivation During Gait Rehabilitation,” in 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, March 23–March 27, 2019 (IEEE), 500–509. doi:10.1109/VR.2019.8797828
Keshavarz, B., and Hecht, H. (2014). Pleasant Music as a Countermeasure against Visually Induced Motion Sickness. Appl. Ergon. 45, 521–527. doi:10.1016/j.apergo.2013.07.009
Keshavarz, B., Stelzmann, D., Paillard, A., and Hecht, H. (2015). Visually Induced Motion Sickness Can be Alleviated by Pleasant Odors. Exp. Brain Res. 233, 1353–1364. doi:10.1007/s00221-015-4209-9
Keshner, E. A., and Fung, J. (2017). The Quest to Apply VR Technology to Rehabilitation: Tribulations and Treasures. J. Vestib. Res. 27, 1–5. doi:10.3233/VES-170610
Kizony, R., Levin, M. F., Hughey, L., Perez, C., and Fung, J. (2010). Cognitive Load and Dual-Task Performance During Locomotion Poststroke: A Feasibility Study Using a Functional Virtual Environment. Phys. Ther. 90, 252–260. doi:10.2522/ptj.20090061
Kleim, J. A., and Jones, T. A. (2008). Principles of Experience-Dependent Neural Plasticity: Implications for Rehabilitation after Brain Damage. J. Speech, Lang. Hearing Res. 51, 225–239. doi:10.1044/1092-4388(2008/018)
Koenig, S. T., Krch, D., Lange, B. S., and Rizzo, A. (2019). “Virtual Reality and Rehabilitation,” in Handbook of Rehabilitation Psychology. 3rd edn, Editors L. A. Brenner, S. A. Reid-Arndt, T. R. Elliott, R. G. Frank, and B. Caplan (Washington: American Psychological Association), 521–539.
Kourtesis, P., Collina, S., Doumas, L. A. A., and MacPherson, S. E. (2019). Technological Competence Is a Pre-condition for Effective Implementation of Virtual Reality Head Mounted Displays in Human Neuroscience: A Technological Review and Meta-Analysis. Front. Hum. Neurosci. 13, 342. doi:10.3389/fnhum.2019.00342
LaViola, J. J. (2000). A Discussion of Cybersickness in Virtual Environments. ACM SIGCHI Bull. 32, 47–56. doi:10.1145/333329.333344
Lawo, M., and Knackfuß, P. (2018). Advanced Studies Mobile Research Center Bremen. Wiesbaden: Springer Fachmedien Wiesbaden.
Lazar, J., Feng, J. H., and Hochheiser, H. (2017). Research Methods in Human-Computer Interaction. 2nd edn. Cambridge, Massachusetts: Morgan Kaufmann.
Liao, W.-C., Lai, C.-L., Hsu, P.-S., Chen, K.-C., and Wang, C.-H. (2018). Different Weight Shift Trainings Can Improve the Balance Performance of Patients with a Chronic Stroke: A Randomized Controlled Trial. Medicine 97. doi:10.1097/md.0000000000013207
Liu, G., Geng, X., Liu, L., and Wang, Y. (2019). Haptic Based Teleoperation with Master-Slave Motion Mapping and Haptic Rendering for Space Exploration. Chin. J. Aeronautics 32, 723–736. doi:10.1016/j.cja.2018.07.009
Lubetzky, A. V., Kelly, J., Wang, Z., Gospodarek, M., Fu, G., Sutera, J., et al. (2020). Contextual Sensory Integration Training via Head Mounted Display for Individuals with Vestibular Disorders: a Feasibility Study. Disabil. Rehabil. Assistive Tech. 0, 1–11. doi:10.1080/17483107.2020.1765419
Madshaven, J. M., and Markseth, T. F. (2020). Investigating a Virtual Reality Solution for Rehabilitation in a Biomechatronics Lab and Home Environment. Master’s thesis. Grimstad: University of Agder.
Mansfield, A., and Inness, E. L. (2015). Force Plate Assessment of Quiet Standing Balance Control: Perspectives on Clinical Application within Stroke Rehabilitation. Rehabil. Process Outcome 4, S20363. doi:10.4137/RPO.S20363
Pareek, S., Chembrammel, P., and Kesavadas, T. (2018). “Development and Evaluation of Haptics-Based Rehabilitation System,” in 2018 International Symposium on Medical Robotics (ISMR), Atlanta, GA, United States, March 1–March 3, 2018 (IEEE), 1–6.
Plouzeau, J., Paillot, D., Chardonnet, J.-R., and Merienne, F. (2015). “Effect of Proprioceptive Vibrations on Simulator Sickness during Navigation Task in Virtual Environment,” in International Conference on Artificial Reality and Telexistence Eurographics Symposium on Virtual Environments, Kyoto, Japan, 1–6.
Proffitt, R., and Lange, B. (2015). The Feasibility of a Customized, In-Home, Game-Based Stroke Exercise Program Using the Microsoft Kinect Sensor. Int. J. Telerehab. 7, 23–34. doi:10.5195/IJT.2015.6177
Salisbury, K., Brock, D., Massie, T., Swarup, N., and Zilles, C. (1995). “Haptic Rendering: Programming Touch Interaction with Virtual Objects,” in Proceedings of the 1995 Symposium on Interactive 3D Graphics (Monterey, California, United States: Association for Computing Machinery), 123–130.
Sanfilippo, F., and Pacchierotti, C. (2020). “A Low-Cost Multi-Modal Auditory-Visual-Tactile Framework for Remote Touch,” in 2020 3rd International Conference on Information and Computer Technologies (ICICT), San Jose, CA, United States, March 9–March 12, 2020 (IEEE), 213–218.
Sanfilippo, F., and Pacchierotti, C. (2018). A Wearable Haptic System for the Health Monitoring of Elderly People in Smart Cities. Int. J. Onl. Eng. 14, 52–66. doi:10.3991/ijoe.v14i08.8571
Sanfilippo, F., Weustink, P. B., and Pettersen, K. Y. (2015). “A Coupling Library for the Force Dimension Haptic Devices and the 20-sim Modelling and Simulation Environment,” in IECON 2015 - 41st Annual Conference of the IEEE Industrial Electronics Society, Yokohama, Nov 9–Nov 12, 2015 (IEEE), 000168–000173.
Schiza, E., Matsangidou, M., Neokleous, K., and Pattichis, C. S. (2019). Virtual Reality Applications for Neurological Disease: A Review. Front. Robot. AI 6. doi:10.3389/frobt.2019.00100
Seo, N. J., Arun Kumar, J., Hur, P., Crocher, V., Motawar, B., and Lakshminarayanan, K. (2016). Usability Evaluation of Low-Cost Virtual Reality Hand and Arm Rehabilitation Games. J. Rehabil. Res. Dev. 53, 321–334. doi:10.1682/JRRD.2015.03.0045
Sessoms, P. H., Gottshall, K. R., Collins, J.-D., Markham, A. E., Service, K. A., and Reini, S. A. (2015). Improvements in Gait Speed and Weight Shift of Persons with Traumatic Brain Injury and Vestibular Dysfunction Using a Virtual Reality Computer-Assisted Rehabilitation Environment. Mil. Med. 180, 143–149. doi:10.7205/MILMED-D-14-00385
Shema, S. R., Brozgol, M., Dorfman, M., Maidan, I., Sharaby-Yeshayahu, L., Malik-Kozuch, H., et al. (2014). Clinical Experience Using a 5-Week Treadmill Training Program with Virtual Reality to Enhance Gait in an Ambulatory Physical Therapy Service. Phys. Ther. 94, 1319–1326. doi:10.2522/ptj.20130305
Stożek, J., Rudzińska, M., Pustułka-Piwnik, U., and Szczudlik, A. (2016). The Effect of the Rehabilitation Program on Balance, Gait, Physical Performance and Trunk Rotation in Parkinson’s Disease. Aging Clin. Exp. Res. 28, 1169–1177. doi:10.1007/s40520-015-0506-1
Tokuyama, Y., Rajapakse, R. P. C. J., Miyazato, T., Konno, K., and Hung, Y.-P. (2019). “Development of a Puzzle-Box Game with Haptic Feedback,” in International Workshop on Advanced Image Technology (IWAIT) 2019 (Singapore: International Society for Optics and Photonics), 110490T.
Tuthill, J. C., and Azim, E. (2018). Proprioception. Curr. Biol. 28, R194–R203. doi:10.1016/j.cub.2018.01.064
Weiss, P. L., Naveh, Y., and Katz, N. (2003). Design and Testing of a Virtual Environment to Train Stroke Patients with Unilateral Spatial Neglect to Cross a Street Safely. Occup. Ther. Int. 10, 39–55. doi:10.1002/oti.176
Wienrich, C., Weidner, C. K., Schatto, C., Obremski, D., and Israel, J. H. (2018). “A Virtual Nose as a Rest-Frame - the Impact on Simulator Sickness and Game Experience,” in 2018 10th International Conference on Virtual Worlds and Games for Serious Applications (VS-Games), Wurzburg, Germany, Sep 5–Sep 7, 2018 (IEEE), 1–8.
Keywords: virtual reality, biomechatronics lab, physical rehabilitation, user experience, home system
Citation: Madshaven JM, Markseth TF, Jomås DB, Isabwe GMN, Ottestad M, Reichert F and Sanfilippo F (2021) Investigating the User Experience of Virtual Reality Rehabilitation Solution for Biomechatronics Laboratory and Home Environment. Front. Virtual Real. 2:645042. doi: 10.3389/frvir.2021.645042
Received: 22 December 2020; Accepted: 23 April 2021;
Published: 13 May 2021.
Edited by: Thierry Duval, IMT Atlantique Bretagne-Pays de la Loire, France
Reviewed by: Christophe Lohr, IMT Atlantique Bretagne-Pays de la Loire, France; Anat Vilnai Lubetzky, New York University, United States
Copyright © 2021 Madshaven, Markseth, Jomås, Isabwe, Ottestad, Reichert and Sanfilippo. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Julie Madelen Madshaven, julie.madshaven@uia.no