- 1Centre for Pain Research, University of Bath, Bath, United Kingdom
- 2Orion Corporation, Research and Development, Espoo, Finland
- 3Healthware International (Nordics), Helsinki, Finland
- 4Department of Experimental-Clinical and Health Psychology, Ghent University, Ghent, Belgium
Introduction: Avatars are becoming more common in virtual reality, where they act as guides, teachers, companions, or mentors through immersive experiences. Special attention needs to be paid to their design to ensure credibility and working alliance, and to allow for the optimal delivery of behavior change content.
Methods: We present a new embodied Semi-Autonomous Mentoring Intelligence (SAMI) avatar used in an immersive virtual reality intervention for the self-management of chronic pain. We discuss the research findings that guided the design and development of SAMI, such as methods to promote working alliance with non-human agents, optimal characteristics of non-human agents, and features of effective “automation”.
Conclusion: We provide a table of considerations and recommendations for researchers involved in designing future virtual reality characters. We provide suggestions on how future research could advance SAMI further for use in pain management and related interventions.
Introduction
Chronic pain is a major public health problem. At a population level, back pain alone was the leading global cause of years lived with disability in 2019 (Chen et al., 2022), and conservative estimates suggest that 20% of people report disability and distress from untreated pain (Goldberg and McGee, 2011). This burden is consistent across countries (Goldberg and McGee, 2011) and is growing (Fayaz et al., 2016; Eccleston et al., 2018a). Pharmacological interventions for chronic pain, aimed primarily at analgesia, have had limited success (Varrassi et al., 2010; Busse et al., 2018). Psychological interventions aimed at promoting self-management of pain and re-engagement in valued life activities (Williams et al., 2020; Fisher et al., 2021) are effective and have been adopted in guidelines as the minimum standard of care (Carville et al., 2021); however, their availability is scarce.
Digital therapeutics have emerged as a potential solution to the problem of scaling effective psychological interventions for chronic pain management (Keefe et al., 2012; Palanica et al., 2020; Trost et al., 2021). There is an opportunity to improve access to treatment by adopting information, communication, and sensing technologies, to improve the quality of those treatments, and to develop novel interventions (Eccleston et al., 2018b). Importantly, if some or all aspects of treatment can be automated, with a shift from an expert human to an expert automated system, then it may be possible to significantly increase the number of people who can be helped. Digital therapeutics can also be used to optimize interventions through novel data collection methods, advanced data analysis, deep learning, and loop-back techniques applied during and between specific interventions (e.g., adjusting the speed or difficulty of a given challenge). This approach can also be exploited when trying to find the optimal tone and content of communication between the expert automated system and a person using the system (e.g., Ewbank et al., 2020).
Automating advice, education, or therapy is not a new goal. There is a long history of attempts to create autonomous systems, starting with “Eliza” (Weizenbaum, 1966), and including more recent examples such as “A.L.I.C.E” (Wallace, 2009) and Sophia, created by Hanson Robotics in 2016 (Retto, 2017). Automation is defined as “applications of robotics, artificial intelligence, machine learning, machine vision, and similar emerging and mature digital technologies that will allow human work to be substituted by computer capital” (Willis et al., 2019, p. 2). We know, however, of no attempts to create fully autonomous systems providing advice and instruction for the self-management of chronic pain.
In 2019 we developed a Virtual Reality (VR) intervention for adults (>17 years) with chronic low back pain and a high fear of movement, and we investigated its efficacy and safety in a clinical trial (Eccleston et al., 2022). In the fully immersive environment, patients were guided by a virtual mentor who provided health information and tasks to perform. Although the mentor was introduced as a person, he was not embodied as an avatar: the material was delivered as audio and written text on screen. Patients reported high levels of satisfaction with the experience, whilst being aware that the mentor was automated and delivered as voice only, text only, or both, and so was not embodied.
For the next version of this immersive environment and treatment, we aimed to improve the mentor’s function and appearance for adults with chronic low back pain. In particular, we decided that the mentor should be physically represented in an animated humanoid form, rather than delivered as text or audio alone. From here we set out to design this Semi-Autonomous Mentoring Intelligence (SAMI), a non-human agent scripted to engage with a user and guide them through a pain management intervention. SAMI was semi-autonomous in that the mentor communicated a pre-selected script, but could not respond to spontaneous speech or communication from the patient. SAMI needed to be as credible, effective, and persuasive as possible.
In the absence of a theoretical framework to guide the design of SAMI, we conducted several literature reviews to understand aspects core to a positive therapeutic relationship, which are summarized here. These fall into three key domains that emerged as particularly important in the design and development of a virtual mentor: i) characteristics and features to support the content, tone, and optimal delivery of a pain management intervention; ii) features of effective “automation” in the delivery of expert advice; and iii) methods to promote working alliance with non-human agents.
In this paper we review the evidence for each of these domains and the decisions we took when designing SAMI. This paper is not intended to be an exhaustive description of how best to design a virtual mentor, but is a summary and explanation of some key aspects for researchers and developers to consider when developing virtual mentors in interventions, and is intended as a springboard for others to build upon.
Optimal Design Characteristics and Features of SAMI
Avatar use is still in its infancy in digital therapeutics. There are examples in the field of pain of the total (Martini et al., 2015) or partial (Hoffman, 2021) representation of the user as a human agent in VR. Outside of pain there are more mature developments for specific uses, such as a self-created avatar to embody the experience of auditory hallucination (Alderson-Day and Jones, 2018). When designing SAMI, an embodied mentor in a behavior change intervention, we elected to use a non-human avatar for two main reasons. First, in this version we were not able to allow users to choose or personalize SAMI, so a mentor with neutral features was necessary to reduce the likelihood of users finding the mentor irritating or distracting; neutral features were most achievable with a non-human avatar. Second, we were mindful of the danger of creating a “das Unheimliche” (Freud, 2003) experience, known in robotics as the Uncanny Valley Effect (UVE) (Mori, 1970). The UVE describes what happens when a robot or avatar becomes too human-like, such that eeriness and a sense of discomfort overtake the sense of familiarity or comfort. A non-human avatar was therefore selected to reduce the likelihood of the UVE being invoked. Schwind et al. (2018) suggest that focusing on both facial expression and voice, and on their congruency in expressing emotion, is key to avoiding uncanny experiences (see also Tinwell et al., 2010; Tinwell et al., 2011; Tinwell et al., 2013). Synchronization is key to avoiding reality disruptions, particularly between avatar voice, facial expression, and bodily attitude. Schwind et al. (2018) also recommend using stylized aesthetics and evolved empathic features such as large eyes, symmetric faces, and iconic movements. Human voices are recommended over synthesized voices, and keeping lip movement in line with speech is considered important (Duffy and Pisoni, 1992; Costa et al., 2018). Taken together, we chose to create a stylized avatar with humanoid features (eyes, arms, hands) and additional features suited to a “mentor” role in an immersive environment; for example, SAMI can move with six degrees of freedom to create humanlike movements, with the aim of avoiding the UVE and increasing affinity and working alliance with SAMI. Figures 1 and 2 show the development process, and Figure 3 the final sketch of SAMI used in the experience.
SAMI was designed with five main emotional expressions, which can be adapted to the verbal content used in mentoring. These expressions were decided by group consensus of VR app designers and researchers during design and development meetings, and after consideration of the literature on therapist traits and attitudes linked to a positive working alliance with patients. It was not possible in the development process for this iteration of SAMI to include an exhaustive range of emotions for SAMI to display; therefore, the most important and common emotions were developed and incorporated into the intervention. Congruency between facial display, voice tone, and physical movements was coded into SAMI’s design to avoid disruption. Table 1 shows the emotional expressions that were used, and these can be applied to any future therapeutic mentor.
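To illustrate how congruency between face, voice, and body can be enforced in software, the sketch below (Python, with entirely hypothetical expression and animation names; it does not describe SAMI’s actual codebase) ties every scripted line to a single expression profile so that the three channels are always selected together and cannot drift out of emotional register.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Expression(Enum):
    """Hypothetical stand-ins for a small, fixed set of scripted emotional expressions."""
    NEUTRAL = auto()
    WARM = auto()
    ENCOURAGING = auto()
    CONCERNED = auto()
    CELEBRATORY = auto()


@dataclass(frozen=True)
class ExpressionProfile:
    """Congruent settings for face, voice, and body for one expression."""
    facial_animation: str   # name of the facial animation clip
    voice_style: str        # tag selecting the matching voice recording
    body_animation: str     # gesture/posture clip played alongside speech


# One profile per expression, so all channels are always chosen together.
PROFILES = {
    Expression.NEUTRAL: ExpressionProfile("face_neutral", "voice_neutral", "body_relaxed"),
    Expression.WARM: ExpressionProfile("face_soft_smile", "voice_warm", "body_open_lean_in"),
    Expression.ENCOURAGING: ExpressionProfile("face_smile_nod", "voice_upbeat", "body_forward_gesture"),
    Expression.CONCERNED: ExpressionProfile("face_brow_raise", "voice_gentle", "body_still_attentive"),
    Expression.CELEBRATORY: ExpressionProfile("face_broad_smile", "voice_bright", "body_arms_raised"),
}


@dataclass(frozen=True)
class ScriptedLine:
    text: str
    expression: Expression


def play_line(line: ScriptedLine) -> ExpressionProfile:
    """Return the single congruent profile for a scripted line."""
    return PROFILES[line.expression]


if __name__ == "__main__":
    line = ScriptedLine("Well done, you finished today's movement task!", Expression.CELEBRATORY)
    print(play_line(line))
```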
Features of Effective “Automation” in the Delivery of Expert Advice
Automation in part or in whole is a common goal in digital therapeutics. Automation offers the opportunity to scale interventions, which brings with it the benefits of reducing costs and errors (Haight and Caringi, 2007). There are excellent examples in VR showing how interventions can be delivered successfully, even in the absence of a trained therapist, and sometimes automated digital therapeutic solutions can improve on in-person outcomes. For example, Freeman et al. (2018) investigated the efficacy of an automated VR cognitive intervention for patients with height intolerance, guided by an avatar virtual coach (animated using the motion and voice capture of an actor), and found that the intervention was highly effective in reducing patients’ fear and more effective than in-person interventions.
All content communicated by SAMI was scripted beforehand, creating a one-size-fits-all model. As discussed above, facial expressions and movements were concordant with the tone of the material. We designed SAMI to be female, as research shows that, when a preference for the gender of a therapist is expressed, it is significantly more likely to be for a female therapist (Pikus and Heavey, 1996; Bhati, 2014; Liddon et al., 2018; Seidler et al., 2021). We elected to use a voice actor who is female and a native English speaker. Making SAMI uniform and consistent across all participants ensured that everyone received the same high-quality experience. The opportunities to personalize appearance, gender, and language are vast, and are possible with investment and advances in technology.
Automation has limitations. Critically, SAMI is unable to predict, plan for, avoid, or manage risk, which is why we have labelled SAMI as semi-autonomous; risk needs to be managed outside of the experience (e.g., Abd-Alrazaq et al., 2020). Users also need to be made explicitly aware that the mentor is automated (Kretzschmar et al., 2019). However, technology is available that can measure the impact on the user (e.g., facial expression, success on cognitive tasks, measured levels of stress and/or pain, how many seconds the user is engaged with the avatar), which can inform an optimization loop-back. We were not able to utilize this technology in this version. SAMI was also designed in one language only, and there was no option for personalization in this version. SAMI had a limited set of responses, was scripted, and could not interpret and respond to the user in the program, nor could SAMI address the user by name. As highlighted above, future versions are likely to involve more options for the user to design a mentor they find more relatable, increasing working alliance.
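Although we did not implement such a loop-back in this version, the sketch below shows one plausible shape it could take: per-session signals feed simple rules that adjust the next session. The signal names, thresholds, and adjustment rules are hypothetical illustrations only and are not part of SAMI.

```python
from dataclasses import dataclass


@dataclass
class SessionSignals:
    """Hypothetical per-session measurements that could feed a loop-back."""
    seconds_engaged_with_avatar: float
    task_success_rate: float        # 0.0-1.0, proportion of tasks completed
    self_reported_stress: float     # e.g., 0-10 rating collected in-headset
    self_reported_pain: float       # e.g., 0-10 rating collected in-headset


@dataclass
class SessionPlan:
    task_difficulty: float          # 0.0 (easiest) to 1.0 (hardest)
    pacing_seconds_per_task: float


def adapt_next_session(signals: SessionSignals, current: SessionPlan) -> SessionPlan:
    """Adjust difficulty and pacing from the previous session's signals.

    Illustrative rules only: ease off when stress/pain are high,
    and progress when the user is succeeding and staying engaged.
    """
    difficulty = current.task_difficulty
    pacing = current.pacing_seconds_per_task

    if signals.self_reported_stress >= 7 or signals.self_reported_pain >= 7:
        difficulty = max(0.0, difficulty - 0.1)
        pacing *= 1.2  # slow down
    elif signals.task_success_rate > 0.8 and signals.seconds_engaged_with_avatar > 300:
        difficulty = min(1.0, difficulty + 0.1)

    return SessionPlan(task_difficulty=difficulty, pacing_seconds_per_task=pacing)


if __name__ == "__main__":
    previous = SessionPlan(task_difficulty=0.4, pacing_seconds_per_task=60.0)
    signals = SessionSignals(420.0, 0.9, 3.0, 2.0)
    print(adapt_next_session(signals, previous))
```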
Methods to Promote Working Alliance With SAMI in Pain Management
Important to engagement in any behavior change intervention is the working alliance between user and mentor. Working alliance, also referred to as the therapeutic alliance, is the collaborative relationship between a mentor and client in which there is a consensus and willingness on both sides to work together to achieve a positive outcome for the client (Bordin, 1979; Horvath and Greenberg, 1989). A strong working alliance has consistently been found to predict improved outcomes (Horvath et al., 2011; Fuertes et al., 2017). For example, a systematic review by Lakke and Meerman (2016) reported that when patients with chronic musculoskeletal pain perceived a strong working alliance during treatment, this was associated with reduced pain severity and pain interference, and an improvement in physical functioning. Furthermore, Kinney et al. (2020) reported that, for patients with chronic musculoskeletal pain participating in physical therapy, a stronger working alliance made an improvement in pain outcomes more likely. For our purposes it is important to ensure that a working alliance can be built and maintained between SAMI and the user.
A working alliance can be formed between a human and a virtual agent or an online program, similar to the working alliance between a human user and a human mentor (Heim et al., 2018). Working alliance can be measured in digital therapeutics using validated measures such as the Working Alliance Inventory applied to Virtual and Augmented Reality (WAI-VAR) (Miragall et al., 2015). In exploring patient-therapist relationships, Bordin (1979) established three core features of a successful alliance: (a) agreement on goals, (b) consensus on the tasks that lead to those goals, and (c) a clear or explicit bond formed by using a language of relationships (see also Horvath and Greenberg, 1989; Tryon et al., 2018). In the build process we adopted a principle of designing for optimal working alliance by checking the script for potential ruptures in alliance: every user had an opportunity to choose and rank goals from a wide selection, which were reinforced by SAMI; SAMI encouraged each patient to re-assess goals regularly; and a range of tasks was introduced to achieve the higher-order goal. In addition to the emotional congruency and empathic content discussed above, non-verbal immediacy behaviors such as hand gestures, facial expressions, eye contact, nodding, smiling, and facing towards the patient were optimized within available parameters (Tickle-Degnen and Gavett, 2003; Bickmore et al., 2005). Conversely, defensive stances such as crossed arms, asymmetrical arm postures, and facing away from the user were never used (Pinto et al., 2012).
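As a concrete illustration of the goal-agreement and re-assessment elements, the sketch below shows one way a scripted mentor could store a user's ranked goals and periodically prompt a review. The goal wording and the five-session review interval are hypothetical and are not taken from the SAMI script.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class GoalSelection:
    """A user's ranked choice of goals, agreed at the start of the program."""
    ranked_goals: List[str] = field(default_factory=list)

    def top_goal(self) -> str:
        return self.ranked_goals[0] if self.ranked_goals else "staying active"


def reinforcement_prompt(selection: GoalSelection, session_number: int) -> str:
    """Build a scripted line that reinforces the agreed goal (Bordin's goal
    agreement) and, every few sessions, invites the user to re-assess it."""
    line = (
        f"You told me your most important goal is {selection.top_goal()}. "
        "Today's tasks are chosen to move you towards it."
    )
    if session_number % 5 == 0:
        line += " Shall we check whether this is still the goal that matters most to you?"
    return line


if __name__ == "__main__":
    goals = GoalSelection(["walking the dog every day", "returning to gardening"])
    print(reinforcement_prompt(goals, session_number=5))
```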
Discussion
We built an avatar for use in a VR immersive rehabilitation solution for adults with chronic pain who want to return to valued activity and improve function. SAMI was created guided by three dominant considerations: optimization of non-human avatar features (including empathy and likeability), automation, and design for optimal working alliance. Table 2 summarizes the key features we considered important; it is not an exhaustive list, and should be used as a starting point for the design of future virtual mentors for therapeutic purposes, within the context and constraints of the intervention being designed or delivered. Age-relevant adjustments should also be considered; for example, we designed SAMI to be a virtual mentor for adults, but child- or adolescent-specific design features may also be important to consider.
Next-generation pain management solutions will need to see a shift in expertise from individual practitioners to systems (Khirasaria et al., 2020; Rejula et al., 2021). To provide solutions at scale, some level of automation will be necessary. SAMI is described here as a case study in designing for scalability. Our next steps are empirical. First, we will explore the acceptability, usability, and tolerability of SAMI over a long rehabilitation period. Second, we will qualitatively explore working alliance and ruptures in working alliance, to reduce any dissonance. Finally, we will explore features of SAMI that can appear repetitive or “automatic”, which can unhelpfully feel depersonalizing.
There is scope for SAMI to evolve as technology and research advance. The field needs a theoretical framework to aid the efficacy and effectiveness of virtual mentors, outlining which features are most important in developing and sustaining positive interactions with patients. Potential improvements include making SAMI more attentive to the user by increasing opportunities for personalization. Further, we aim to explore natural language processing to create opportunities for SAMI to be responsive to users, rather than following a script. And finally, we will collect naturalistic use data on the features of SAMI’s behavior to improve her credibility, likeability, and effectiveness in delivering or guiding users through an immersive experience.
Data Availability Statement
The original contributions presented in the study are included in the article/supplementary material, further inquiries can be directed to the corresponding author.
Author Contributions
JB conducted literature searches, wrote the first draft, edited, and approved for submission. EF edited the manuscript and approved for submission. JT edited the manuscript and approved for submission. SL edited the manuscript and approved for submission. MS edited the manuscript and approved for submission. CE conceptualised the manuscript, edited, and approved for submission.
Funding
This project was funded by Orion Corporation and has also received support from Business Finland, the Finnish government organization for innovation funding and trade, travel, and investment promotion.
Conflict of Interest
CE worked as a consultant to Orion Health in the development of these solutions. SL and MS are employed by Orion Corporation, and JT is employed by Healthware International.
The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Publisher’s Note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
Acknowledgments
The authors thank colleagues at Orion Corporation, Research and Development, Espoo, Finland, and at Healthware International (Nordics), Helsinki, Finland, for their contribution to the design and development of SAMI.
References
Abd-Alrazaq, A. A., Rababeh, A., Alajlani, M., Bewick, B. M., and Househ, M. (2020). Effectiveness and Safety of Using Chatbots to Improve Mental Health: Systematic Review and Meta-Analysis. J. Med. Internet Res. 22 (7), e16021. doi:10.2196/16021
Alderson-Day, B., and Jones, N. (2018). Understanding AVATAR Therapy: Who, or What, Is Changing? Lancet Psychiatry 5 (1), 2–3. doi:10.1016/s2215-0366(17)30471-6
Khirasaria, R., Singh, V., and Batta, A. (2020). Exploring Digital Therapeutics: The Next Paradigm of Modern Health-Care Industry. Perspect. Clin. Res. 11 (2), 54. doi:10.4103/picr.PICR_89_19
Bhati, K. S. (2014). Effect of Client-Therapist Gender Match on the Therapeutic Relationship: An Exploratory Analysis. Psychol. Rep. 115 (2), 565–583. doi:10.2466/21.02.PR0.115c23z1
Bickmore, T., Gruber, A., and Picard, R. (2005). Establishing the Computer-Patient Working Alliance in Automated Health Behavior Change Interventions. Patient Educ. Couns. 59 (1), 21–30. doi:10.1016/j.pec.2004.09.008
Bordin, E. S. (1979). The Generalizability of the Psychoanalytic Concept of the Working Alliance. Psychotherapy Theory, Res. Pract. 16, 252–260. doi:10.1037/h0085885
Busse, J. W., Wang, L., Kamaleldin, M., Craigie, S., Riva, J. J., and Montoya, L. (2018). Opioids for Chronic Noncancer Pain: A Systematic Review and Meta-Analysis. JAMA 320 (23), 2448-2460. doi:10.1001/jama.2018.18472
Carville, S., Constanti, M., Kosky, N., Stannard, C., and Wilkinson, C. (2021). Chronic Pain (Primary and Secondary) in Over 16s: Summary of NICE Guidance. BMJ 373, n895. doi:10.1136/bmj.n895
Chen, S., Chen, M., Wu, X., Lin, S., Tao, C., Cao, H., et al. (2022). Global, Regional and National Burden of Low Back Pain 1990-2019: A Systematic Analysis of the Global Burden of Disease Study 2019. J. Orthop. Transl. 32, 49–58. doi:10.1016/j.jot.2021.07.005
Costa, S., Brunete, A., Bae, B.-C., and Mavridis, N. (2018). Emotional Storytelling Using Virtual and Robotic Agents. Int. J. Hum. Robot. 15 (3), 1850006. doi:10.1142/S0219843618500068
Duffy, S. A., and Pisoni, D. B. (1992). Comprehension of Synthetic Speech Produced by Rule: A Review and Theoretical Interpretation. Lang. Speech 35 (4), 351–389. doi:10.1177/002383099203500401
Eccleston, C., Fisher, E., Liikkanen, S., Sarapohja, T., Stenfors, C., Jääskeläinen, S. K., et al. (2022). A Prospective, Double-Blind, Pilot Randomized Controlled Trial of an ‘embodied’ Virtual Reality Intervention for Chronic Low Back Pain in Adults. Pain [in press]. doi:10.1097/j.pain.0000000000002617
Eccleston, C., Tabor, A., and Keogh, E. (2018b). “Using Advanced Technologies to Improve Access to Treatment, to Improve Treatment, and to Directly Alter Experience,” in Psychological Approaches to Pain Management. Editors D. C. Turk and R. J. Gatchel, 3rd Edition (NY: Guilford Press), 289–301.
Eccleston, C., Wells, C., and Morlion, B. (2018a). European Pain Management. Oxford: Oxford University Press.
Ewbank, M. P., Cummins, R., Tablan, V., Bateup, S., Catarino, A., Martin, A. J., et al. (2020). Quantifying the Association Between Psychotherapy Content and Clinical Outcomes Using Deep Learning. JAMA Psychiatry 77 (1), 35–43. doi:10.1001/jamapsychiatry.2019.2664
Fayaz, A., Croft, P., Langford, R. M., Donaldson, L. J., and Jones, G. T. (2016). Prevalence of Chronic Pain in the UK: A Systematic Review and Meta-Analysis of Population Studies. BMJ Open 6, e010364. doi:10.1136/bmjopen-2015-010364
Fisher, E., Villanueva, G., Henschke, N., Nevitt, S. J., Zempsky, W., Probyn, K., et al. (2021). Efficacy and Safety of Pharmacological, Physical, and Psychological Interventions for the Management of Chronic Pain in Children: A WHO Systematic Review and Meta-Analysis. Pain 163 (1), e1–e19. doi:10.1097/j.pain.0000000000002297
Freeman, D., Haselton, P., Freeman, J., Spanlang, B., Kishore, S., Albery, E., et al. (2018). Automated Psychological Therapy Using Immersive Virtual Reality for Treatment of Fear of Heights: A Single-Blind, Parallel-Group, Randomised Controlled Trial. Lancet Psychiatry 5 (8), 625–632. doi:10.1016/S2215-0366(18)30226-8
Freud, S. (2003). “The Uncanny,” in The Uncanny. Translated by D. McLintock (NY: Penguin Classics), 121–162.
Fuertes, J. N., Toporovsky, A., Reyes, M., and Osborne, J. B. (2017). The Physician-Patient Working Alliance: Theory, Research, and Future Possibilities. Patient Educ. Couns. 100 (4), 610–615. doi:10.1016/j.pec.2016.10.018
Goldberg, D. S., and McGee, S. J. (2011). Pain as a Global Public Health Priority. BMC Public Health 11 (1), 1–5. doi:10.1186/1471-2458-11-770
Haight, J. M., and Caringi, R. G. (2007). Automation vs. Human Intervention: What Is the Best Mix for Optimum System Performance? A Case Study. Int. J. Risk Assess. Manag. 7 (5), 708. doi:10.1504/IJRAM.2007.014095
Heim, E., Rötger, A., Lorenz, N., and Maercker, A. (2018). Working Alliance with an Avatar: How Far Can We Go with Internet Interventions? Internet Interv. 11, 41–46. doi:10.1016/j.invent.2018.01.005
Hoffman, H. G. (2021). Interacting with Virtual Objects via Embodied Avatar Hands Reduces Pain Intensity and Diverts Attention. Sci. Rep. 11 (1), 1–13. doi:10.1038/s41598-021-89526-4
Horvath, A. O., Del Re, A. C., Flückiger, C., and Symonds, D. (2011). Alliance in Individual Psychotherapy. Psychotherapy 48 (1), 9–16. doi:10.1037/a0022186
Horvath, A. O., and Greenberg, L. S. (1989). Development and Validation of the Working Alliance Inventory. J. Couns. Psychol. 36, 223–233. doi:10.1037/0022-0167.36.2.223
Keefe, F. J., Huling, D. A., Coggins, M. J., Keefe, D. F., Rosenthal, Z. M., Herr, N. R., et al. (2012). Virtual Reality for Persistent Pain: A New Direction for Behavioral Pain Management. Pain 153 (11), 2163–2166. doi:10.1016/j.pain.2012.05.030
Kinney, M., Seider, J., Beaty, A. F., Coughlin, K., Dyal, M., and Clewley, D. (2020). The Impact of Therapeutic Alliance in Physical Therapy for Chronic Musculoskeletal Pain: A Systematic Review of the Literature. Physiother. Theory Pract. 36 (8), 886–898. doi:10.1080/09593985.2018.1516015
Kretzschmar, K., Tyroll, H., Pavarini, G., Manzini, A., and Singh, I. (2019). Can Your Phone Be Your Therapist? Young People's Ethical Perspectives on the Use of Fully Automated Conversational Agents (Chatbots) in Mental Health Support. Biomed. Inform. Insights 11, 1178222619829083. doi:10.1177/1178222619829083
Lakke, S. E., and Meerman, S. (2016). Does Working Alliance Have an Influence on Pain and Physical Functioning in Patients with Chronic Musculoskeletal Pain; a Systematic Review. J. Compassionate Health Care 3, 1–10. doi:10.1186/s40639-016-0018-7
Liddon, L., Kingerlee, R., and Barry, J. A. (2018). Gender Differences in Preferences for Psychological Treatment, Coping Strategies, and Triggers to Help-Seeking. Br. J. Clin. Psychol. 57 (1), 42–58. doi:10.1111/bjc.12147
Martini, M., Kilteni, K., Maselli, A., and Sanchez-Vives, M. V. (2015). The Body Fades Away: Investigating the Effects of Transparency of an Embodied Virtual Body on Pain Threshold and Body Ownership. Sci. Rep. 5, 13948. doi:10.1038/srep13948
Miragall, M., Baños, R. M., Cebolla, A., and Botella, C. (2015). Working Alliance Inventory Applied to Virtual and Augmented Reality (WAI-VAR): Psychometrics and Therapeutic Outcomes. Front. Psychol. 6, 1531. doi:10.3389/fpsyg.2015.01531
Palanica, A., Docktor, M. J., Lieberman, M., and Fossat, Y. (2020). The Need for Artificial Intelligence in Digital Therapeutics. Digit. Biomark 4 (1), 21–25. doi:10.1159/000506861
Pikus, C. F., and Heavey, C. L. (1996). Client Preferences for Therapist Gender. J. Coll. Student Psychotherapy 10, 35–43. doi:10.1300/J035v10n04_05
Pinto, R. Z., Ferreira, M. L., Oliveira, V. C., Franco, M. R., Adams, R., Maher, C. G., et al. (2012). Patient-Centred Communication Is Associated with Positive Therapeutic Alliance: A Systematic Review. J. Physiother. 58 (2), 77–87. doi:10.1016/S1836-9553(12)70087-5
Rejula, V., Anitha, J., Belfin, R. V., and Peter, J. D. (2021). Chronic Pain Treatment and Digital Health Era-An Opinion. Front. Public Health 9, 779328. doi:10.3389/fpubh.2021.779328
Retto, J. (2017). Sophia, First Citizen Robot of the World. ResearchGate. Available at: https://www.researchgate.net/publication/321319964_SOPHIA_FIRST_CITIZEN_ROBOT_OF_THE_WORLD. (Accessed February 16, 2022).
Schwind, V., Wolf, K., and Henze, N. (2018). Avoiding the Uncanny Valley in Virtual Character Design. Interactions 25, 45–49. doi:10.1145/3236673
Seidler, Z. E., Wilson, M. J., Kealy, D., Oliffe, J. L., Ogrodniczuk, J. S., and Rice, S. M. (2021). Men's Preferences for Therapist Gender: Predictors and Impact on Satisfaction with Therapy. Couns. Psychol. Q. 35 (1), 173–189. doi:10.1080/09515070.2021.1940866
Takashima, K., Omori, Y., Yoshimoto, Y., Itoh, Y., Kitamura, Y., and Kishino, F. (2008). Effects of Avatar’s Blinking Animation on Person Impressions. Proceedings of Graphics Interface 2008, 169–176.
Tickle-Degnen, L., and Gavett, E. (2003). “Changes in Nonverbal Behavior During the Development of Therapeutic Relationships,” in Nonverbal Behavior in Clinical Settings. Editors P. Philippot, R. S. Feldman, and E. J. Coats (NY: Oxford University Press), 75–110. doi:10.1093/med:psych/9780195141092.003.0004
Tinwell, A., Grimshaw, M., Nabi, D. A., and Williams, A. (2011). Facial Expression of Emotion and Perception of the Uncanny Valley in Virtual Characters. Comput. Hum. Behav. 27 (2), 741–749. doi:10.1016/j.chb.2010.10.018
Tinwell, A., Grimshaw, M., and Williams, A. (2010). Uncanny Behaviour in Survival Horror Games. J. Gaming & Virtual Worlds 2 (1), 3–25. doi:10.1386/jgvw.2.1.3_1
Tinwell, A., Nabi, D. A., and Charlton, J. P. (2013). Perception of Psychopathy and the Uncanny Valley in Virtual Characters. Comput. Hum. Behav. 29, 1617–1625. doi:10.1016/j.chb.2013.01.008
Trost, Z., France, C., Anam, M., and Shum, C. (2021). Virtual Reality Approaches to Pain: Toward a State of the Science. Pain 162 (2), 325–331. doi:10.1097/j.pain.0000000000002060
Tryon, G. S., Birch, S. E., and Verkuilen, J. (2018). Meta-Analyses of the Relation of Goal Consensus and Collaboration to Psychotherapy Outcome. Psychotherapy 55 (4), 372–383. doi:10.1037/pst0000170
Varrassi, G., Müller-Schwefe, G., Pergolizzi, J., Orónska, A., Morlion, B., and Mavrocordatos, P. (2010). Pharmacological Treatment of Chronic Pain - The Need for CHANGE. Curr. Med. Res. Opin. 26 (5), 1231–1245.
Wallace, R. S. (2009). “The Anatomy of A.L.I.C.E,” in Parsing the Turing Test. Editors R. Epstein, G. Roberts, and G. Beber (Dordrecht: Springer), 181–210. doi:10.1007/978-1-4020-6710-5_13
Weizenbaum, J. (1966). ELIZA-a Computer Program for the Study of Natural Language Communication Between Man and Machine. Commun. ACM 9 (1), 36–45. doi:10.1145/365153.365168
Williams, A. C. d. C., Fisher, E., Hearn, L., and Eccleston, C. (2020). Psychological Therapies for the Management of Chronic Pain (Excluding Headache) in Adults. Cochrane Database Syst. Rev. 8, CD007407. doi:10.1002/14651858.CD007407.pub4
Keywords: pain, autonomous agent, mentor, rehabilitation, virtual reality
Citation: Bartlett J, Fisher E, Liikkanen S, Turunen J, Skog M and Eccleston C (2022) The Design and Development of an Embodied Semi-Autonomous Mentoring Intelligence (SAMI) for Use in Virtual Reality Interventions, Operationalized for the Self-Management of Chronic Pain. Front. Virtual Real. 3:882980. doi: 10.3389/frvir.2022.882980
Received: 24 February 2022; Accepted: 07 June 2022;
Published: 07 July 2022.
Edited by:
Denis Martin, Teesside University, United Kingdom
Reviewed by:
Pat Schofield, University of Plymouth, United Kingdom
Margaret Dunham, Edinburgh Napier University, United Kingdom
Copyright © 2022 Bartlett, Fisher, Liikkanen, Turunen, Skog and Eccleston. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: C. Eccleston, c.eccleston@bath.ac.uk