
ORIGINAL RESEARCH article

Front. Comput. Sci., 18 June 2021
Sec. Human-Media Interaction
This article is part of the Research Topic Designing Technology for Emotions to Improve Mental Health and Wellbeing

Is Fun For Wellness Engaging? Evaluation of User Experience of an Online Intervention to Promote Well-Being and Physical Activity

Michael P. Scarpa1*, Isaac Prilleltensky1, Adam McMahon1, Nicholas D. Myers2, Ora Prilleltensky1, Seungmin Lee2, Karin A. Pfeiffer2, André G. Bateman2 and Ahnalee M. Brincks3
  • 1School of Education and Human Development, University of Miami, Coral Gables, FL, United States
  • 2Department of Kinesiology, Michigan State University, East Lansing, MI, United States
  • 3School of Human Development and Family Studies, Michigan State University, East Lansing, MI, United States

Online well-being interventions demonstrate great promise in terms of both engagement and outcomes. Fun For Wellness (FFW) is a novel online intervention grounded in self-efficacy theory and intended to improve multidimensional well-being and physical activity through multi-modal methods, including capability-enhancing opportunities and learning experiences such as games, video vignettes, and self-assessments. RCT studies have suggested that FFW is efficacious in improving subjective and domain-specific well-being, and effective in improving mental health, physical health, physical activity, and self-efficacy in United States adults who are overweight and in the general population. The present study uses qualitative and quantitative user experience data collected during two RCTs to understand and evaluate engagement with FFW, its drivers, and its outcomes. Results suggest that FFW is enjoyable, moderately engaging, and easy to use, and that it contributes to positive outcomes, including skill development and enhanced confidence, for both overweight individuals and the general adult population. Drivers of engagement appear to include rewards, gamification, scenario-based learning, visual tracking for self-monitoring, ease of use and simple communications, and the entertaining, interactive nature of program activities. Findings indicate that there are opportunities to streamline and simplify the experience. These results can help improve FFW and contribute to the science of engagement with online interventions designed to improve well-being.

Introduction

The literature suggests numerous benefits of online intervention delivery. First, an increasing number of people already use the internet regularly in their daily lives, and technological advances have made it easy to reach them in a familiar context (Lewis et al., 2017). Additionally, online interventions have the potential to be more cost-effective than in-person interventions, especially when distributed broadly (Lobban et al., 2019). Online programs also provide opportunities for intelligent, responsive design which tailors content to the needs of individuals and cultures (Norman et al., 2007). Further, in a pandemic context, online interventions offer the ability to engage participants when physical interaction is not possible (Wright and Caudill, 2020). Because of these advantages, investigating online interventions for the enhancement of well-being is very worthwhile.

Well-being is best understood as multidimensional (Decancq and Lugo, 2013; Prilleltensky et al., 2015). In line with this understanding, evidence suggests that online interventions may be effective for the promotion of well-being across domains and outcomes. From a physical well-being perspective, one recent review found significant improvements in diet, physical activity, adiposity, tobacco use, and excess alcohol use associated with internet interventions (Afshin et al., 2016). Others have found supporting evidence for improvements in physical activity (Vandelanotte et al., 2007; Joseph et al., 2014), weight loss (Martin et al., 2015), and pain management (Palermo et al., 2009). In the psychological sphere, humor-based online interventions using positive psychology principles have been found to increase happiness and attenuate depression (Wellenzohn et al., 2016). Extending well-being to the interpersonal domain, studies of online interventions have found benefits of increased social support and decreased isolation (Nicholas et al., 2012; Barry et al., 2018).

In the occupational domain, online interventions have been suggested as a cost-effective and potentially efficacious approach to reducing workplace stress (Aikens et al., 2014; Ebert et al., 2014), improving cardiovascular fitness among employees (Ahtinen et al., 2013; Aneni et al., 2014), and developing resilience and engagement among students (Mueller et al., 2018). Some studies have also investigated outcomes spanning multiple well-being domains. For example, a 2017 study of an online positive psychology intervention using the PERMA profiler found increases in measures of both psychological well-being and occupational well-being (Neumeier et al., 2017). Further, a 2014 study which operationalized well-being in terms of healthy behaviors, emotional health, and positive work environment found that both well-being and social connection increased alongside engagement with an online intervention (Cobb and Poirier, 2014).

Despite this promise, online interventions pose unique challenges. Attrition is an important concern, with some evidence suggesting systematic—yet understudied—differences in attrition patterns between online and in-person interventions (Nicholas et al., 2010; Bouwmen et al., 2019; Linardon and Fuller-Tyszkiewicz, 2020). Further, because internet interventions are often accessed without external guidance, users may struggle to navigate all the contents by themselves or to access the most relevant components (Barak and Grohol, 2011; Chou et al., 2017).

Given these challenges, researchers have suggested several approaches for improving online interventions. One approach is to include interactive and gamified elements, theorized to increase motivation for behavior change (Baranowski et al., 2008). The promise in this approach was borne out in a 2016 review demonstrating effects in over half of interventions reviewed, with the strongest effects for physical activity and the weakest effect for cognitions (Johnson et al., 2016). Another success factor is personalization of intervention content, which was shown to increase engagement in online weight-loss interventions (Brindal et al., 2012). Third, professionalism and simplicity of presentation may encourage participation (Brouwer et al., 2009). Finally, Khaylis and colleagues (2010) found encouraging weight loss outcomes associated with the inclusion of feedback, self-monitoring, structure, individual tailoring, and social support in online interventions; these components may hold promise for other outcomes as well. Each of these techniques is thought to improve outcomes by enhancing user engagement with interventions.

User Engagement With Online Interventions

Engagement refers to the quantity and quality of user interaction and experience with an intervention, and is related to utilization concepts such as dosage, adherence, and exposure (Couper et al., 2010; Taki et al., 2017). While some authors conceive of engagement as simple usage (e.g., Maher et al., 2014), others distinguish mere activity from effective interaction with program elements directly related to outcomes (Yardley et al., 2016). Others have gone so far as to operationalize engagement in terms of multiple psychological and behavioral constructs such as interest, attention, cognitive absorption, and presence (Short et al., 2018); or in behavioral, cognitive, and affective terms (Kelders, 2019). Engagement is thought to be an indispensable driver of program utilization and results (Yardley et al., 2016). As such, it has been suggested that behavioral variables related to engagement are critical to understanding program outcomes (Schwarzer and Satow, 2012).

Though newly developing compared to the study of feasibility and effectiveness (Morrison and Doherty, 2014), the literature on engagement has produced several useful frameworks. For example, Short et al. (2015) suggest that engagement be characterized as the interaction of environmental factors, user attributes, and intervention design elements. Similarly, Yardley and colleagues (2010) suggest a distinction between a digital perspective focused on software usability and a behavior change perspective, focused on achievement of physiological, cognitive, and behavioral outcomes. Both models underscore the complex, multifaceted nature of engagement with online interventions.

Diverse delivery and design factors have shown promise as drivers of engagement. First, push reminders, or reminders directly to the inbox, mailbox, or mobile device of participants, have been shown to increase engagement and outcomes in multiple studies (e.g., Bennett and Glasgow, 2009). Persuasive technology such as computer-mediated encouragement (Kelders, 2019), incentives (Schubart et al., 2011), and self-monitoring (Bennett and Glasgow, 2009) have also been recommended. Finally, tailoring or personalization (Couper et al., 2010) and gamification (Kelders, 2019), discussed above as drivers of outcomes, have also been identified as important contributors to engagement.

Other promising tools and techniques fall under the rubric of social influence. Researchers have found evidence that features such as displaying progress to friends in the program and creating opportunities for participants to provide support to one another are powerful and cost-effective drivers of engagement across various types of intervention touchpoint (Poirier and Cobb, 2012). Others have found engagement benefits associated with the use of social networking sites, prompting recommendations that interventions be designed with similar features such as social sharing, access to supportive communities, and integrated tools for visible personal tracking (Chung et al., 2017). Where resources allow, dedicated guidance and support (which may be combined with automated digital features) have also enhanced engagement with various interventions (Doherty et al., 2012; Yardley et al., 2016).

Across all these methods, however, and in line with the complexity discussed above, researchers caution that there is substantial diversity in the “who” and “how” of intervention access. As such, providing users a limited but varied set of options, especially in combination with personalization and tailoring (Kelders, 2019), may be a more fruitful approach than searching for singular, universal best practices (Morrison and Doherty, 2014).

Assessing Engagement

Despite the growing acknowledgement of the key role engagement plays in producing intervention outcomes, consensus towards the conceptualization and assessment of engagement has been slow to emerge (Baltierra et al., 2016; Perski et al., 2017). Numerous models have been put forward. Some researchers have favored objective activity and interaction metrics, including software paradata (Couper et al., 2010), taken to indicate depth and breadth of utilization (e.g., Arnold et al., 2019). Others have drawn from an understanding of engagement as involving an emotional and cognitive response to intervention activities. These researchers have generally evaluated engagement in terms of self-reported affective (e.g., satisfaction, enjoyment, anxiety), decisional (e.g., loyalty, intent to use), and hybrid (e.g., immersion, flow) constructs. Still others have aligned the measurement of engagement with participant-rated intervention attributes such as credibility, involvement, and usability (e.g., Craig-Lefebvre et al., 2010; McClure et al., 2013).

While no standardized approach has achieved preeminence, each approach aligns with one or more facets or traditions of engagement. Some authors have suggested that a combined approach which integrates both utilization behavior and subjective experience measures may hold promise as a unifying conceptual framework (Perski et al., 2017). At the same time, it has also been put forward that engagement measurement strategies should be specific to the aims and methods of the intervention (Young et al., 2020). In line with this understanding, below we discuss key aspects of the Fun for Wellness intervention intended to drive outcomes by way of increasing engagement, conceptualized in terms of both completion of intervention activities and enhanced interest, cognition, and affect.

Fun for Wellness Intervention

Fun for Wellness (FFW) is an online intervention designed to promote well-being in six domains of life we call I COPPE (Interpersonal, Community, Occupational, Physical, Psychological, and Economic), and to increase physical activity. In addition, it aims to increase self-efficacy (Figure 1). The intervention consists of 152 activities or challenges, requiring about 12 h total to complete. Each challenge lasts between one and four minutes; challenges include video games, case studies with professional actors, mini-coaching sessions, reflection exercises, and humor breaks. Participants were required to complete four initial challenges to access the remaining 148 activities. The first four challenges oriented the user to the program and to the characters presented in the various vignettes. Computer software tracked how many challenges each participant completed. Each challenge was assigned a participation score depending on its duration and content (Myers et al., 2017a). Participants had to accumulate at least 21 points to reach the threshold of “engagement.”
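The accumulation logic behind this threshold can be sketched as follows. The challenge names and point values here are hypothetical illustrations (the published scoring assigned points based on each challenge's duration and content); only the 21-point engagement rule is taken from the text above.

```python
# Sketch of the FFW engagement threshold described above.
# Point values per challenge are hypothetical; in the actual program,
# points were assigned based on challenge duration and content.

def total_points(completed, points_by_challenge):
    """Sum participation points over the challenges a user completed."""
    return sum(points_by_challenge.get(c, 0) for c in completed)

def is_engaged(completed, points_by_challenge, threshold=21):
    """A participant counts as 'engaged' at 21 or more accumulated points."""
    return total_points(completed, points_by_challenge) >= threshold

# Hypothetical example: three challenges worth 8, 7, and 9 points.
points = {"intro_1": 8, "intro_2": 7, "goal_setting_1": 9}
print(is_engaged(["intro_1", "intro_2", "goal_setting_1"], points))  # True (24 >= 21)
```

A participant who stopped after the first challenge in this example (8 points) would fall below the threshold and not be classified as engaged.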


FIGURE 1. The Fun For Wellness conceptual model for the promotion of self-efficacy, subjective well-being and physical activity.

Prior to starting the program, participants are asked to complete the I COPPE self-assessment (Figure 2A). Given evidence that scenario-based learning is superior to didactic methods (Irvine et al., 2015), and that skill building is more effective than providing information (Conley et al., 2015), FFW focuses on multiple case studies and the building of competencies. The program teaches participants skills related to seven drivers of change we call BET I CAN (Behaviors, Emotions, Thoughts, Interactions, Context, Awareness, and Next steps). Each BET I CAN driver of change is translated into action through two skills. Figure 2B presents a list of all the skills taught under each driver. In general, the BET I CAN model builds on proven methods of change, including cognitive behavioral theory, positive psychology, counseling psychology, behavioral economics, contemplative practices, narrative therapy, psychoanalytic methods, and self-directed change (Myers et al., 2017a).


FIGURE 2. FFW sample pages. (A) Example challenge: interactive well-being assessment (I COPPE). (B) Progress indicators and links to challenge sets. (C) Links to challenges with guidance and achievement indicators.

Two reasons account for integrating various approaches to change in FFW: First, some people relate better to certain techniques over others, and second, in a large sample people are bound to be in different stages of change that require different interventions. For some people who are in the precontemplation stage of change, raising awareness of the problem is more useful than jumping into action. For others in the action stage, making a plan and making it stick is more beneficial than learning about the issue (Norcross, 2012).

Participants were incentivized to complete a battery of assessments, including the I COPPE survey and others. Upon completion of the assessments at baseline, participants were given a $10 Amazon electronic gift card. Upon completion of the assessments at the end of the 30-day program, participants received an additional $15. Participants completing the final assessment 30 days after finishing the program received an additional $5 in Amazon gift cards. For more details on the intervention, please see Myers et al. (2019).

Fun for Wellness Outcomes

FFW was designed 1) to improve well-being in all the I COPPE domains of life, 2) to increase physical activity, and 3) to enhance the self-efficacy of participants with respect to subjective well-being and physical activity (Figure 1). Challenges were constructed to meet one or more of these three goals. Figure 2C shows the menu of activities designed to help participants set a goal, which is a crucial skill in improving subjective well-being and physical activity. Each challenge listed in Figure 2C is designed to teach participants the essential aspects of effective goal setting. This method was used for teaching each of the 14 skills listed in Figure 2B. We used multiple educational modalities to keep the user engaged.

The intervention has been the subject of two randomized controlled trials (RCTs) in which its efficacy and effectiveness have been demonstrated. In an RCT with healthy adults, FFW was instrumental in improving interpersonal, psychological, community, and economic well-being (Myers et al., 2017b). FFW was also helpful in generating actions to promote well-being in the physical and interpersonal domains (Myers, Dietz, et al., 2018). Participants in the study reported eating more fruits and vegetables, exercising more, and investing more in relationships. FFW also increased self-efficacy (Myers et al., 2017a). In a second RCT, participants increased their self-efficacy in the domain of physical health, which in turn increased their reported physical activity levels (Myers et al., 2020). Furthermore, that study showed that FFW improved community, physical, psychological, and occupational wellness (Myers et al., 2021). Finally, participants reported improvements in scores of physical and mental health status (Prilleltensky et al., 2020).

During the 2019 study, the number of challenges completed by each participant was tracked to measure engagement. According to this measure, chosen as an indicator of active utilization in contrast to the more passive time-spent measures, 81.9% of respondents demonstrated engagement with FFW (defined as accumulating 21 activity points). On average, participants completed 57.4 challenges, suggesting a reasonable degree of behavioral engagement.

The results outlined below provide details concerning the subjective (i.e., cognitive and affective) and behavioral experiences of participants. While evidence regarding the positive outcomes associated with usage of FFW has been provided in the references above, limited documentation regarding participant engagement with the program has been published. This is the first study to provide empirical evidence concerning affective and cognitive engagement with FFW. Since innovative programs must be subjected not only to outcome evaluations but also to process evaluations, we thought it was necessary to learn more about user engagement. This will facilitate FFW program improvement and refinement and may yield data of value in the development of other interventions.

Research Objective

Because user experience is a prerequisite for engagement, the objective of the study was to evaluate user experience and engagement with the FFW platform through the collection of quantitative and qualitative data. A better understanding of user experience and engagement with FFW can inform further developments and changes required to make it more engaging and effective.

Methods

Ethics Approval

The data reported in this article were collected in two studies. Both studies adhered to the ethical standards of the universities involved and to the 1964 Helsinki Declaration and its later amendments or comparable ethical standards. The first study received approval from the (institution masked for review) Institutional Review Board (IRB no. 20150237). The second study, a collaboration between (institutions masked for review), received approval from their respective ethical review boards (UM IRB no. 20170541 and MSU IRB no. 00000979).

Study Design

Data for the present analysis were taken from follow-up surveys conducted with participants in two RCTs of the Fun for Wellness online intervention. These surveys were conducted in 2015 and 2019, respectively, as an integrated part of the FFW experience. The first study recruited participants from a research university in the southeastern United States. The second study used a national sample of participants recruited through a panel company. Participants were randomly assigned to either the FFW intervention or a control group. Randomization was done by the software. Once participants signed the electronic consent form, they were directed to complete a battery of assessments including the I COPPE survey, self-efficacy measures, and, in the case of the second RCT, a number of physical activity measures as well as the SF-36v2 Health Survey, an index of physical and mental health. Participants were then instructed to engage with the platform for 30 days. The program was available 24/7. At the end of 30 days, participants were instructed to complete the same battery as at baseline, plus a user experience questionnaire (Table 1). Finally, the battery administered at baseline was completed once more at 60 days. For full details regarding measurement instruments, please see Myers et al. (2019) and Prilleltensky et al. (2020).


TABLE 1. Agreement by survey item, 2015 and 2019.

Participants

Recruitment, screening, and consent for both studies were conducted online for adults over the age of 18 who would be able to access and understand an English-language, online intervention. In the 2015 study, 500 participants were consented and randomly assigned to either a control group or the FFW intervention. In this manner, 237 participants were assigned to the FFW group and completed the survey which forms the basis of the present analysis. These participants were prompted to focus on general well-being. In the 2019 study, 900 participants were consented and assigned randomly, with 410 being assigned to the FFW condition and completing the experience survey. Additional criteria for these participants included being overweight or obese, and they were primed to reflect on increasing physical activity as part of their engagement with FFW. In both cohorts, no statistically significant demographic differences were identified between experimental and control groups.

In the 2015 study, the number of participants who answered quantitative items ranged from 126 to 127, depending on the particular question. In the 2019 study, the number of respondents ranged from 246 to 249, again depending on the particular item. Combining the two studies, the number of participants who answered each item ranged from 372 to 376. This is a robust number upon which to draw some inferences regarding user engagement with FFW.

Ninety-one participants provided qualitative responses. Participants were asked to select a response to seven different demographic variables. Regarding Sex, 66% of respondents selected “Female”, 24% selected “Male”, and one respondent did not answer. The most common Age group was 35–54, representing 51.6% of respondents; notably, no respondents over the age of 65 were included in our sample. For the Race/Ethnicity item, 66% of respondents selected “White”, 15% selected “Hispanic or Latino”, 10% selected “Black or African American”, 3% selected “Asian”, and 1% selected each of “Native Hawaiian or Other Pacific Islander” and “American Indian or Alaskan Native.” Among other variables, 59% of respondents selected “Married”; 71% indicated that they were employed full-time and 13% indicated they were unemployed; 57% had college education or were graduates; and 47% had incomes between US $50,000–$99,999, with 23% reporting incomes below that grouping and 15% providing no response. In line with recruitment eligibility for both RCTs, all participants were residents of the United States. Some implications of our participant demographics are discussed under limitations, below.

User Experience Measures

Fifteen quantitative questions were delivered to participants in the 2015 study. Two additional questions designed to measure physical activity were added to the 2019 survey, resulting in 17 items total. In 2015, questions were scored on a 5-point scale ranging from 0 (strongly disagree) to 4 (strongly agree), with 2 representing neither agree nor disagree. In 2019, the neutral midpoint was removed, resulting in a 4-point scale ranging from 0 (strongly disagree) to 3 (strongly agree). This change was made between RCTs in order to reduce ambiguity and more clearly distinguish positive from negative user experiences. Because of the limited number of options, we considered these scales to be categorical.

Both groups of participants were also asked a single open-response question, reading “Feel free to add additional comments about your experience with Fun For Wellness.”

Analyses

Two analyses were conducted. First, for each touchpoint, the percentage of responses selecting either agree or strongly agree (Agreement) and disagree or strongly disagree (Disagreement) was calculated for each item. These scores are reported for each item in Table 1. Second, qualitative data from both studies were pooled and analyzed for basic sentiment and themes. These data are summarized in Table 2.
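The first analysis reduces to a simple proportion over response categories. A minimal sketch, using the 2019-style 0–3 coding described in the Methods (the response values below are invented for illustration, not study data):

```python
# Illustrative computation of per-item Agreement/Disagreement percentages,
# as reported in Table 1. Codes follow the 2019 scale described above:
# 0 = strongly disagree, 1 = disagree, 2 = agree, 3 = strongly agree.

def agreement_rates(responses,
                    agree_codes=frozenset({2, 3}),
                    disagree_codes=frozenset({0, 1})):
    """Return (% agree or strongly agree, % disagree or strongly disagree)."""
    n = len(responses)
    agree = sum(r in agree_codes for r in responses)
    disagree = sum(r in disagree_codes for r in responses)
    return 100 * agree / n, 100 * disagree / n

# Invented responses to a single survey item from eight participants.
item_responses = [3, 3, 2, 1, 2, 3, 0, 2]
agree_pct, disagree_pct = agreement_rates(item_responses)
print(round(agree_pct, 1), round(disagree_pct, 1))  # 75.0 25.0
```

For the 2015 scale, the neutral midpoint (code 2 on the 0–4 scale) would belong to neither set, which is why Agreement and Disagreement need not sum to 100% for that cohort.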


TABLE 2. Qualitative responses by themes.

Results

Engagement and Accessibility Items

Overall, 10 of 16 survey items related to engagement factors. The most-agreed items in the 2015 survey included The technology was easy to use (Agreement = 85.7%), The text was easy to follow (Agreement = 81%), and My overall experience with this program was positive (Agreement = 80.3%). Among 2019 responses, the most-agreed items were The technology was easy to use (Agreement = 91.9%), The program was enjoyable (Agreement = 91.9%), and My overall experience with this program was positive (Agreement = 94.4%).

The least-agreed engagement items, common to both survey touchpoints, included Having someone speak the written text was helpful (Agreement 2015/2019 = 45.7/82.1%), The exercises were useful (Agreement = 66.7/86.2%), and The length of modules was appropriate (Agreement = 64.6/86.2%).

All other engagement related items featured agreement percentages above 86.5% in the 2019 survey and above 58.3% in the 2015 survey.

Outcomes Items

Overall, six items concerned program outcomes. The two least-agreed were I feel more confident in my ability to engage in a recommended amount of weekly physical activity for health (Agreement = 81.0%), which was only asked in 2019, and I learned important skills to improve my well-being (Agreement 2015/2019 = 71.7/88.2%). The two most agreed were I plan to use the skills I learned (Agreement = 76.4/91.5%) and I would recommend this program to a friend (Agreement = 66.9/92.4%) (cf. Hamilton et al., 2014).

All engagement and outcomes items, along with agreement percentage and descriptive statistics, can be reviewed in Table 1.

Qualitative Analysis

Comments were analyzed according to two general classification schemes. First, in an overall sentiment analysis, 75.8% of cases were identified as positive, 6.6% were mixed, neutral, or unclear, and 5.5% were negative. Brief examples of positive, mixed, and negative items include:

Positive: “IT WAS FUN!”

Mixed/Neutral/Unclear: “Thank you for letting me be a part of this program.”

Negative: “poor”

Next, comments were analyzed for emerging themes. Six major themes, each appearing in at least 10% of all responses, were identified. All themes and sub-themes with N counts are displayed along with sample quotations in Table 2.

Theme 1: Experience of Emotion

This theme encapsulates participants’ articulations of various emotional states prompted by their engagement with FFW. In total, 41% (37 of the 91 comments) aligned with this theme. Of these, all but one referenced positive states. In particular, 24 respondents identified feelings of happiness, fun, or enjoyment (e.g., “I enjoyed this very much.”); eight identified pleasant feelings or liking the experience (e.g., “The investigation was pleasant”); two mentioned being interested (e.g., “This was interesting to do…”); and two mentioned feeling motivated or empowered (e.g., “This experience was empowering.”). The sole negative affective comment was, “The games were slightly tedious.”

Theme 2: Favorability of Fun For Wellness

Thirty of 91 comments featured some sort of summative judgement of the FFW experience. Of these, 26 were favorable (nine highly favorable) and four were unfavorable. Highly favorable responses included “This is a wonderful program I would recommend to others. The tips and examples were easy to follow and very positive…” and “Excellent tool. It was very dynamic.” Favorable responses included weaker positive descriptors (e.g., “good way to improving [sic] myself”). Negative comments included “poor”, “too repetitive”, and “…incredibly time intensive.” One mixed comment noted that goal setting was “useful” but “The games were slightly tedious.”

Theme 3: Learning

Twenty-five comments addressed experiences of learning in the program. These were divided into general comments about learning via FFW (N = 15; e.g., “Fun For Wellness is a good way to learn something.”) and specific experiences of learning knowledge, skills, and abilities (N = 10; e.g., “Thanks to [FFW] I know how to improve my relationship with my spouse”). By contrast, one participant communicated a lack of enduring learning: “I don’t really remember what I learned.”

Theme 4: Impact of FFW on Participant’s Life

Twenty comments articulated ways in which participation in FFW had a positive impact on participants’ lives, such as encouraging them to take action or providing them with useful knowledge. No comments reflected a negative impact. Examples include “...I was able to use a lot of what I learned at home with my family.” Another participant shared, “I appreciated that even though there are things I know I should be doing; the reminder and reinforcement to actually take action on the items was very helpful. The exercises here were fun/engaging and a reminder kick in the rear that there are things I'm not doing that I actually need to do. It gave me a chance to check in; evaluate; and take accountability.”

Theme 5: Expressions of Gratitude

Seventeen comments expressed gratitude for being part of the experience. Examples include comments such as “I am grateful for the opportunity to use this software and learn the techniques taught in fun for wellness. Thank you.” and “Thank you for letting me be a part of this program.”

Theme 6: Wants to Do More With Fun For Wellness

Nine participants expressed a desire to engage further with FFW. For example, a 2019 participant stated, “Would like to see more programs like this to delve deeper to improve health and reinforce positive steps.” Similarly, a 2015 participant shared, “I really enjoyed this program and would continue to participate if it continued…”

Discussion

Whereas FFW has shown positive outcomes in terms of efficacy (Myers et al., 2017a; Myers et al., 2017b; Myers, Dietz, et al., 2018) and effectiveness (Myers, Prilleltensky, et al., 2019; Myers et al., 2020; Prilleltensky et al., 2020; Lee, 2019; Myers et al., 2021), this is the first article to report on user engagement. Overall, the data indicate that FFW is quite appealing to users. In the 2019 study, the percentage of participants who reported satisfaction in terms of engagement and outcomes ranged from 81 to 94.4%. The overall rates of positivity for the 2015 study are lower because the questionnaire included a “neutral” option; still, the range of positive responses in that investigation was 45.7 to 85.7%. Perhaps a better measure of engagement with FFW derives from the low percentage of participants who “disagreed” that FFW was engaging: in the 2015 study, disagreement percentages ranged from 0.8 to 14.2%, whereas in the 2019 evaluation they ranged from 5.6 to 19%.

In terms of actual usage of the intervention, our 2019 study found that 81.9% of participants met the criterion for engagement, which was to accumulate at least 21 points based on the duration and content of challenges. In the same study, the average number of BET I CAN challenges completed was 57.4 out of 152, a completion rate of 37.8%. Although we had hoped the percentage of completed tasks would be higher, it was more than enough to classify 81.9% of participants as engaged with the program under our definition. The qualitative data reported above show enthusiasm for FFW and an overall appreciation for the intervention.
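The point-based engagement criterion described above can be sketched as follows. This is a minimal illustration only: the study specifies the 21-point threshold, but the per-challenge point values used here (weighting by duration and interactivity) are hypothetical assumptions, not the scoring rules actually used in FFW.

```python
# Hypothetical sketch of a point-based engagement criterion like the one
# described above. Only the 21-point threshold comes from the study; the
# per-challenge point values below are illustrative assumptions.

ENGAGEMENT_THRESHOLD = 21  # points required to classify a participant as engaged


def challenge_points(duration_min: float, is_interactive: bool) -> int:
    """Assign illustrative points to one completed challenge.

    Assumption: longer and interactive challenges earn more points.
    """
    points = 1 if duration_min < 5 else 2
    if is_interactive:
        points += 1
    return points


def is_engaged(completed_challenges: list) -> bool:
    """Classify a participant as engaged if accumulated points meet the threshold."""
    total = sum(challenge_points(d, i) for d, i in completed_challenges)
    return total >= ENGAGEMENT_THRESHOLD


# Example: 12 short interactive challenges earn 2 points each (24 total),
# which meets the 21-point threshold.
participant = [(3.0, True)] * 12
print(is_engaged(participant))  # True
```

Under a scheme like this, a participant need not complete every challenge to count as engaged, which is consistent with the observation that a 37.8% average completion rate still classified 81.9% of participants as engaged.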

We attribute the reasonable level of engagement with the program to seven attributes that have been identified in the literature: rewards, including a small monetary incentive (Kelders, 2019); gamification (Baranowski et al., 2008; Kelders, 2019); scenario-based learning, or case studies (Irvine et al., 2015); skill building (Conley et al., 2015); humor (Wellenzohn et al., 2016); multimodal delivery (Cobb and Poirier, 2014); and visual tracking (Bennett and Glasgow, 2009; Chung et al., 2017). With respect to rewards, the software was programmed so that participants received applause and visual cheers for completing challenges. Participants could also earn about $50 for completing all the test batteries. Our overall approach was to engage the user through multiple modalities that elicit behavioral, cognitive, and affective responses. Each skill was taught through a variety of games, self-reflection exercises, mini-coaching sessions, and videos with professional actors. We wanted to generate interest through a multisensory experience. By encouraging laughing, thinking, listening, playing games, writing, and watching videos, we employed a variety of pedagogical instruments that have been shown to enhance attention. Our approach is in line with the VARK methodology, according to which online learning is enhanced by the use of visual, aural, read/write, and kinesthetic modalities (Lee, 2019).

In addition, about fifteen percent of the challenges were video games: interactive games that reinforced the concepts being taught. Beyond this, gamification elements included clear objectives, progress bars and completion indicators, completion statistics, novelty, meaningful narrative, and multisensory experiences (Toda et al., 2019). We also used scenario-based learning in the form of case studies with professional actors. These were short vignettes, lasting anywhere from one to two minutes, and represented about twenty percent of all the challenges.

Skill building was at the core of the intervention. We taught participants fourteen specific skills associated with seven drivers of change we call BET I CAN (behaviors, emotions, thoughts, interactions, context, awareness, and next steps). For example, under behaviors, we taught participants how to create positive habits. Under emotions we taught them how to cultivate positive feelings. Under interactions participants learned how to connect and communicate with others.

Humor was built into the case studies, but there were also humor breaks in which the second author, an award-winning humor writer, told real and imagined funny stories using the concepts being taught. We purposefully used multiple modes of learning because people learn in different ways. We surmised that some would enjoy games more than mini-coaching sessions, whereas others would relate more to the videos than to the reflection exercises. The multimodal nature of the intervention addressed personality differences and the need to vary the pedagogy to maintain participant interest. Finally, with regard to visual tracking, users could monitor their status through a prominent progress bar.

The qualitative themes of learning, gratitude, enjoyment, and positive impact support the quantitative results and our conjectures regarding the role of skill building, humor, and scenario-based learning. Very few participants objected to aspects of the intervention, though some noted that a few activities were repetitive and that FFW was time-consuming. This feedback is useful to the research team working to refine FFW.

Although participants did not mention it, one aspect that we failed to foster in FFW was social support. Even though there were chat rooms for participants to interact with one another, that feature was not used. We know from previous research that social support is an important aspect of successful online interventions (Barry et al., 2018; Khaylis et al., 2010; Nicholas et al., 2012) and an important driver of engagement (Poirier and Cobb, 2012), and we should reinforce it in future versions of FFW.

Overall, our objective data on task completion and our subjective data on user experience demonstrate that participants enjoyed FFW and were moderately engaged. Our user experience data align well with Kelders' (2019) definition of engagement as consisting of behavioral, cognitive, and affective aspects. We measured behavior in terms of task completion (i.e., 81.9% of participants met the criterion for user engagement), cognition in terms of learning acquisition (e.g., “I learned important skills to improve my well-being”), and affect in terms of emotions elicited by the program (i.e., the program was enjoyable).

We believe computer scientists and behavioral and health specialists engaged in the creation of online programs should attend to both objective engagement criteria, such as time on task and task completion, and subjective reports, such as enjoyment and learning. The complementary nature of quantitative and qualitative user experience data is important to our growing understanding of engagement drivers (Perski et al., 2017).

Limitations and Conclusion

Several limitations of the present study warrant discussion. Most importantly, our data cannot tell us which of the seven attributes listed above are the most or least important drivers of engagement. Our evaluation did not go deep enough to discern the differential impact of each type of challenge. Future studies should examine this question to determine the most engaging and efficient ways to deliver online interventions.

Another limitation of the present survey involves recruitment. As participants in an RCT, our respondents were compensated for their data and surveyed voluntarily. Members of the general population, who would not be compensated for using FFW, might experience it differently. Relatedly, respondents were participants who completed the program; our data therefore do not speak to the experiences of those who dropped out due to low engagement or other attrition factors. Further, our sample skewed toward a white, middle-class, married audience and was restricted to the United States. Future studies should investigate the extent to which these findings hold for more diverse, traditionally understudied, and international populations.

Nevertheless, in our view it is likely that successful interventions will have to retain multimodality as a key driver of engagement. At this early stage of development in the deployment of online interventions for health and well-being, it seems to us that researchers and interventionists should continue to use rewards, scenario-based learning, humor, gamification, visual progress cues, and skill-building.

FFW was born out of the realization that there was a need for a universal, engaging, online, problem-centered, skill-based program to improve subjective well-being and physical activity. Moreover, we wanted to build a platform that would integrate the best features of multiple approaches to behavioral change. As noted above, RCT studies have shown that FFW is both efficacious and effective in improving several aspects of subjective well-being, physical activity, and self-efficacy. The current study adds user experience and engagement data to the body of evidence concerning the usefulness of FFW. Online interventions like FFW have the advantages of scalability, confidentiality, accessibility, affordability, and interactivity. In light of global concerns about the deterioration of the population's mental and physical health, especially in a pandemic context, the improvement of online programs such as FFW seems critical to reaching vast sectors of the population. We encourage computer scientists and health experts to continue to improve online programs until we make a significant dent in the epidemic of poor health afflicting many countries.

Data Availability Statement

The raw data supporting the conclusion of this article will be made available by the authors, without undue reservation.

Ethics Statement

The studies involving human participants were reviewed and approved by the University of Miami Institutional Review Board. The patients/participants provided their written informed consent to participate in this study.

Author Contributions

MS wrote most of the article. IP contributed sections of the paper, including the discussion. AM developed the software for FFW. OP developed a great deal of the content of FFW. The rest of the authors contributed to data collection and data analysis and provided useful edits to the manuscript.

Funding

This project was funded by the Erwin and Barbara Mautner Endowed Chair in Community Well-Being at the University of Miami. The second author of this paper is the holder of the Endowed Chair.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

Afshin, A., Babalola, D., Mclean, M., Yu, Z., Ma, W., Chen, C. Y., et al. (2016). Information Technology and Lifestyle: A Systematic Evaluation of Internet And Mobile Interventions for Improving Diet, Physical Activity, Obesity, Tobacco, and Alcohol Use. J. Am. Heart Assoc. 5 (9), e003058. doi:10.1161/JAHA.115.003058

Ahtinen, A., Mattila, E., Välkkynen, P., Kaipainen, K., Vanhala, T., Ermes, M., et al. (2013). Mobile Mental Wellness Training for Stress Management: Feasibility and Design Implications Based on a One-Month Field Study. JMIR Mhealth Uhealth 1 (2), e11. doi:10.2196/mhealth.2596

Aikens, K. A., Astin, J., Pelletier, K. R., Levanovich, K., Baase, C. M., Park, Y. Y., et al. (2014). Mindfulness Goes to Work. J. Occup. Environ. Med. 56 (7), 721–731. doi:10.1097/jom.0000000000000209

Aneni, E. C., Roberson, L. L., Maziak, W., Agatston, A. S., Feldman, T., Rouseff, M., et al. (2014). A Systematic Review of Internet-Based Worksite Wellness Approaches for Cardiovascular Disease Risk Management: Outcomes, Challenges & Opportunities. PLoS One 9 (1), e83594. doi:10.1371/journal.pone.0083594

Arnold, C., Villagonzalo, K.-A., Meyer, D., Farhall, J., Foley, F., Kyrios, M., et al. (2019). Predicting Engagement with an Online Psychosocial Intervention for Psychosis: Exploring Individual- and Intervention-Level Predictors. Internet interventions 18, 100266. doi:10.1016/j.invent.2019.100266

Baltierra, N. B., Muessig, K. E., Pike, E. C., LeGrand, S., Bull, S. S., Hightow-Weidman, L. B., et al. (2016). More Than Just Tracking Time: Complex Measures of User Engagement With an Internet-Based Health Promotion Intervention. J. Biomed. Inform. 59, 299–307. doi:10.1016/j.jbi.2015.12.015

Barak, A., and Grohol, J. M. (2011). Current and Future Trends in Internet-Supported Mental Health Interventions. J. Techn. Hum. Serv. 29 (3), 155–196. doi:10.1080/15228835.2011.616939

Baranowski, T., Buday, R., Thompson, D. I., and Baranowski, J. (2008). Playing for Real. Am. J. Prev. Med. 34 (1), 74–82. doi:10.1016/j.amepre.2007.09.027

Barry, M. C., Threats, M., Blackburn, N. A., LeGrand, S., Dong, W., Pulley, D. V., et al. (2018). Stay strong! keep ya head up! move on! it gets better!!!!: Resilience Processes in the Healthmpowerment Online Intervention of Young Black Gay, Bisexual and Other Men who Have Sex With Men. AIDS care 30, S27–S38. doi:10.1080/09540121.2018.1510106

Bennett, G. G., and Glasgow, R. E. (2009). The Delivery of Public Health Interventions via the Internet: Actualizing Their Potential. Annu. Rev. Public Health 30, 273–292. doi:10.1146/annurev.publhealth.031308.100235

Bouwman, T., van Tilburg, T., and Aartsen, M. (2019). Attrition in an Online Loneliness Intervention for Adults Aged 50 Years and Older: Survival Analysis. JMIR Aging 2 (2), e13638. doi:10.2196/13638

Brindal, E., Freyne, J., Saunders, I., Berkovsky, S., Smith, G., and Noakes, M. (2012). Features Predicting Weight Loss in Overweight or Obese Participants in a Web-Based Intervention: Randomized Trial. J. Med. Internet Res. 14 (6), e173. doi:10.2196/jmir.2156

Brouwer, W., Oenema, A., Crutzen, R., de Nooijer, J., de Vries, N. K., and Brug, J. (2009). What Makes People Decide to Visit and Use an Internet‐delivered Behavior‐change Intervention? Health. Educ 109 (6), 460-473. doi:10.1108/09654280911001149

Chou, T., Bry, L. J., and Comer, J. S. (2017). Multimedia Field Test: Evaluating the Creative Ambitions of SuperBetter and its Quest to Gamify Mental Health. Cogn. Behav. Pract. 24 (1), 115–120. doi:10.1016/j.cbpra.2016.10.002

Chung, C. F., Agapie, E., Schroeder, J., Mishra, S., Fogarty, J., and Munson, S. A. (2017). When Personal Tracking Becomes Social: Examining the Use of Instagram for Healthy Eating. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, 1674–1687.

Cobb, N. K., and Poirier, J. (2014). Effectiveness of a Multimodal Online Well-Being Intervention. Am. J. Prev. Med. 46 (1), 41–48. doi:10.1016/j.amepre.2013.08.018

Conley, C. S., Durlak, J. A., and Kirsch, A. C. (2015). A Meta-Analysis of Universal Mental Health Prevention Programs for Higher Education Students. Prev. Sci. 16 (4), 487–507. doi:10.1007/s11121-015-0543-1

Couper, M. P., Alexander, G. L., Zhang, N., Little, R. J., Maddy, N., Nowak, M. A., et al. (2010). Engagement and Retention: Measuring Breadth and Depth of Participant Use of an Online Intervention. J. Med. Internet Res. 12 (4), e52. doi:10.2196/jmir.1430

Craig Lefebvre, R., Tada, Y., Hilfiker, S. W., and Baur, C. (2010). The Assessment of User Engagement with eHealth Content: The eHealth Engagement Scale1. J. Computer-Mediated Commun. 15 (4), 666–681. doi:10.1111/j.1083-6101.2009.01514.x

Decancq, K., and Lugo, M. A. (2013). Weights in Multidimensional Indices of Wellbeing: An Overview. Econometric Rev. 32 (1), 7–34. doi:10.1080/07474938.2012.690641

Doherty, G., Coyle, D., and Sharry, J. (2012). “Engagement with Online Mental Health Interventions: an Exploratory Clinical Study of a Treatment for Depression,” In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 1421–1430.

Ebert, D. D., Lehr, D., Smit, F., Zarski, A. C., Riper, H., Heber, E., et al. (2014). Efficacy and Cost-Effectiveness of Minimal Guided and Unguided Internet-Based mobile Supported Stress-Management in Employees with Occupational Stress: a Three-Armed Randomised Controlled Trial. BMC Public Health 14 (1), 807. doi:10.1186/1471-2458-14-807

Hamilton, D. F., Lane, J. V., Gaston, P., Patton, J. T., Macdonald, D. J., Simpson, A. H. R. W., et al. (2014). Assessing Treatment Outcomes Using a Single Question. Bone Jt. J. 96 (5), 622–628. doi:10.1302/0301-620x.96b5.32434

Irvine, A. B., Gelatt, V. A., Hammond, M., and Seeley, J. R. (2015). A Randomized Study of Internet Parent Training Accessed from Community Technology Centers. Prev. Sci. 16 (4), 597–608. doi:10.1007/s11121-014-0521-z

Johnson, D., Deterding, S., Kuhn, K.-A., Staneva, A., Stoyanov, S., and Hides, L. (2016). Gamification for Health and Wellbeing: A Systematic Review of the Literature. Internet Interventions 6, 89–106. doi:10.1016/j.invent.2016.10.002

Joseph, R. P., Durant, N. H., Benitez, T. J., and Pekmezi, D. W. (2014). Internet-based Physical Activity Interventions. Am. J. Lifestyle Med. 8 (1), 42–67. doi:10.1177/1559827613498059

Kelders, S. M. (2019). “Design for Engagement of Online Positive Psychology Interventions,” in Positive Psychological Intervention Design and Protocols for Multi-Cultural Contexts (Cham: Springer), 297–313. doi:10.1007/978-3-030-20020-6_13

Khaylis, A., Yiaslas, T., Bergstrom, J., and Gore-Felton, C. (2010). A Review of Efficacious Technology-Based Weight-Loss Interventions: Five Key Components. Telemed. e-Health 16 (9), 931–938. doi:10.1089/tmj.2010.0065

Lee, Y. j. J. (2019). Integrating Multimodal Technologies with VARK Strategies for Learning and Teaching EFL Presentation: An Investigation into Learners' Achievements and Perceptions of the Learning Process. Ajal 2 (1), 17–31. doi:10.29140/ajal.v2n1.118

Lewis, B. A., Napolitano, M. A., Buman, M. P., Williams, D. M., and Nigg, C. R. (2017). Future Directions in Physical Activity Intervention Research: Expanding Our Focus to Sedentary Behaviors, Technology, and Dissemination. J. Behav. Med. 40 (1), 112–126. doi:10.1007/s10865-016-9797-8

Linardon, J., and Fuller-Tyszkiewicz, M. (2020). Attrition and Adherence in Smartphone-Delivered Interventions for Mental Health Problems: A Systematic and Meta-Analytic Review. J. Consulting Clin. Psychol. 88 (1), 1–13. doi:10.1037/ccp0000459

Lobban, F., Jones, S., Robinson, H., and Sellwood, B. (2019). Online Randomised Controlled Trial to Evaluate the Clinical and Cost-Effectiveness of a Web-Based Peer-Supported Self-Management Intervention for Relatives of People with Psychosis or Bipolar Disorder: Relatives' Education and Coping Toolkit (REACT). Health Technol. Assess.

Maher, C. A., Lewis, L. K., Ferrar, K., Marshall, S., De Bourdeaudhuij, I., and Vandelanotte, C. (2014). Are Health Behavior Change Interventions that Use Online Social Networks Effective? A Systematic Review. J. Med. Internet Res. 16 (2), e40. doi:10.2196/jmir.2952

Martin, C. K., Miller, A. C., Thomas, D. M., Champagne, C. M., Han, H., and Church, T. (2015). Efficacy of SmartLossSM , a Smartphone-Based Weight Loss Intervention: Results from a Randomized Controlled Trial. Obesity 23 (5), 935–942. doi:10.1002/oby.21063

McClure, J. B., Shortreed, S. M., Bogart, A., Derry, H., Riggs, K., St John, J., et al. (2013). The Effect of Program Design on Engagement with an Internet-Based Smoking Intervention: Randomized Factorial Trial. J. Med. Internet Res. 15 (3), e69. doi:10.2196/jmir.2508

Morrison, C., and Doherty, G. (2014). Analyzing Engagement in a Web-Based Intervention Platform through Visualizing Log-Data. J. Med. Internet Res. 16 (11), e252. doi:10.2196/jmir.3575

Mueller, K., Prins, R., and de Heer, H. D. (2018). An Online Intervention Increases Empathy, Resilience, and Work Engagement Among Physical Therapy Students. J. Allied Health 47 (3), 196–203.

Myers, N. D., Prilleltensky, I., Hill, C. R., and Feltz, D. L. (2017a). Well-Being Self-Efficacy and Complier Average Causal Effect Modeling: A Substantive-Methodological Synergy. Psychol. Sport Exerc. 30, 135–144. doi:10.1016/j.psychsport.2017.02.010

Myers, N. D., Prilleltensky, I., Prilleltensky, O., McMahon, A., Dietz, S., Rubenstein, C. L., et al. (2017b). Efficacy of the Fun for Wellness Online Intervention to Promote Multidimensional Well-Being: A Randomized Controlled Trial. Prev. Sci. 18, 984–994. doi:10.1007/s11121-017-0779-z

Myers, N. D., Dietz, S., Prilleltensky, I., Prilleltensky, O., McMahon, A., Rubenstein, C. L., et al. (2018). Efficacy of the Fun for Wellness Online Intervention to Promote Well-Being Actions: A Secondary Data Analysis. Games Health J. 7 (4), 225–239. doi:10.1089/g4h.2017.0132

Myers, N. D., Prilleltensky, I., Lee, S., Dietz, S., Prilleltensky, O., McMahon, A., et al. (2019). Effectiveness of the Fun for Wellness Online Behavioral Intervention to Promote Well-Being and Physical Activity: Protocol for a Randomized Controlled Trial. BMC Public Health 19, 737. doi:10.1186/s12889-019-7089-2

Myers, N. D., Prilleltensky, I., McMahon, A., Lee, S., Dietz, S., Prilleltensky, O., et al. (2021). Effectiveness of the Fun for Wellness Online Behavioral Intervention to Promote Subjective Well-Being in Adults with Obesity: A Randomized Controlled Trial. J. Happiness Stud. 22 (4), 1905–1923. doi:10.1007/s10902-020-00301-0

Myers, N. D., McMahon, A., Prilleltensky, I., Lee, S., Dietz, S., Prilleltensky, O., et al. (2020). Effectiveness of the Fun for Wellness Web-Based Behavioral Intervention to Promote Physical Activity in Adults with Obesity: A Randomized Controlled Trial. J. Med. Internet Res. Formative Res. 4, e15919. doi:10.2196/15919

Neumeier, L. M., Brook, L., Ditchburn, G., and Sckopke, P. (2017). Delivering Your Daily Dose of Well-Being to the Workplace: A Randomized Controlled Trial of an Online Well-Being Programme for Employees. Eur. J. Work Organizational Psychol. 26 (4), 555–573. doi:10.1080/1359432x.2017.1320281

Nicholas, J., Proudfoot, J., Parker, G., Gillis, I., Burckhardt, R., Manicavasagar, V., et al. (2010). The Ins and Outs of an Online Bipolar Education Program: a Study of Program Attrition. J. Med. Internet Res. 12 (5), e57. doi:10.2196/jmir.1450

Nicholas, D. B., Fellner, K. D., Frank, M., Small, M., Hetherington, R., Slater, R., et al. (2012). Evaluation of an Online Education and Support Intervention for Adolescents with Diabetes. Soc. Work Health Care 51 (9), 815–827. doi:10.1080/00981389.2012.699507

Norcross, J. C. (2012). Changeology: 5 Steps to Realizing Your Goals and Resolutions. New York, NY: Simon & Schuster. doi:10.1093/oxfordhb/9780195388923.013.0021

Norman, G. J., Zabinski, M. F., Adams, M. A., Rosenberg, D. E., Yaroch, A. L., and Atienza, A. A. (2007). A Review of eHealth Interventions for Physical Activity and Dietary Behavior Change. Am. J. Prev. Med. 33 (4), 336–345. doi:10.1016/j.amepre.2007.05.007

Palermo, T. M., Wilson, A. C., Peters, M., Lewandowski, A., and Somhegyi, H. (2009). Randomized Controlled Trial of an Internet-Delivered Family Cognitive–Behavioral Therapy Intervention for Children and Adolescents with Chronic Pain. Pain 146 (1-2), 205–213. doi:10.1016/j.pain.2009.07.034

Perski, O., Blandford, A., West, R., and Michie, S. (2017). Conceptualising Engagement with Digital Behaviour Change Interventions: a Systematic Review Using Principles from Critical Interpretive Synthesis. Behav. Med. Pract. Pol. Res. 7 (2), 254–267. doi:10.1007/s13142-016-0453-1

Poirier, J., and Cobb, N. K. (2012). Social Influence as a Driver of Engagement in a Web-Based Health Intervention. J. Med. Internet Res. 14 (1), e36. doi:10.2196/jmir.1957

Prilleltensky, I., Dietz, S., Prilleltensky, O., Myers, N. D., Rubenstein, C. L., Jin, Y., et al. (2015). Assessing Multidimensional Well-Being: Development and Validation of the I Coppe Scale. J. Community Psychol. 43 (2), 199–226. doi:10.1002/jcop.21674

Prilleltensky, I., McMahon, A., Myers, N. D., Prilleltensky, O., Dietz, S., Scarpa, M. P., et al. (2020). An Exploration of the Effectiveness of the Fun for Wellness Online Intervention to Promote Health in Adults with Obesity: A Randomized Controlled Trial. J. Prev. Health Promot. 1, 212–239. doi:10.1177/2632077020968737

Schubart, J. R., Stuckey, H. L., Ganeshamoorthy, A., and Sciamanna, C. N. (2011). Chronic Health Conditions and Internet Behavioral Interventions. Comput. Inform. Nurs. CIN 29 (2), 81–92. doi:10.1097/ncn.0b013e3182065eed

Schwarzer, R., and Satow, L. (2012). Online Intervention Engagement Predicts Smoking Cessation. Prev. Med. 55 (3), 233–236. doi:10.1016/j.ypmed.2012.07.006

Short, C., Rebar, A., Plotnikoff, R., and Vandelanotte, C. (2015). Designing Engaging Online Behaviour Change Interventions: a Proposed Model of User Engagement. Eur. Health Psychol. 17 (1), 32–38.

Short, C. E., DeSmet, A., Woods, C., Williams, S. L., Maher, C., Middelweerd, A., et al. (2018). Measuring Engagement in eHealth and mHealth Behavior Change Interventions: Viewpoint of Methodologies. J. Med. Internet Res. 20 (11), e292. doi:10.2196/jmir.9397

Taki, S., Lymer, S., Russell, C. G., Campbell, K., Laws, R., Ong, K.-L., et al. (2017). Assessing User Engagement of an mHealth Intervention: Development and Implementation of the Growing Healthy App Engagement index. JMIR Mhealth Uhealth 5 (6), e89. doi:10.2196/mhealth.7236

Toda, A. M., Oliveira, W., Klock, A. C., Palomino, P. T., Pimenta, M., Gasparini, L., et al. (2019). “A Taxonomy of Game Elements for Gamification in Educational Contexts: Proposal and Evaluation,” in 2019 IEEE 19th International Conference on Advanced Learning Technologies (ICALT) (IEEE), Vol. 2161, 84–88.

Vandelanotte, C., Spathonis, K. M., Eakin, E. G., and Owen, N. (2007). Website-Delivered Physical Activity Interventions. Am. J. Prev. Med. 33 (1), 54–64. doi:10.1016/j.amepre.2007.02.041

Wellenzohn, S., Proyer, R. T., and Ruch, W. (2016). Humor-based Online Positive Psychology Interventions: A Randomized Placebo-Controlled Long-Term Trial. J. Positive Psychol. 11 (6), 584–594. doi:10.1080/17439760.2015.1137624

Wright, J. H., and Caudill, R. (2020). Remote Treatment Delivery in Response to the COVID-19 Pandemic. Psychotherapy and Psychosomatics 89 (3), 1. doi:10.1159/000507376

Yardley, L., Spring, B. J., Riper, H., Morrison, L. G., Crane, D. H., Curtis, K., et al. (2016). Understanding and Promoting Effective Engagement with Digital Behavior Change Interventions. Am. J. Prev. Med. 51 (5), 833–842. doi:10.1016/j.amepre.2016.06.015

Young, L. E., Soliz, S., Xu, J. J., and Young, S. D. (2020). A Review of Social media Analytic Tools and Their Applications to Evaluate Activity and Engagement in Online Sexual Health Interventions. Prev. Med. Rep. 19, 101158. doi:10.1016/j.pmedr.2020.101158

Keywords: engagement, fun for wellness, user experience, online intervention, well-being, multidimensional well-being, self-efficacy

Citation: Scarpa MP, Prilletensky I, McMahon A, Myers ND, Prilleltensky O, Lee S, Pfeiffer KA, Bateman AG and Brincks AM (2021) Is Fun For Wellness Engaging? Evaluation of User Experience of an Online Intervention to Promote Well-Being and Physical Activity. Front. Comput. Sci. 3:690389. doi: 10.3389/fcomp.2021.690389

Received: 02 April 2021; Accepted: 03 June 2021;
Published: 18 June 2021.

Edited by:

Leon Sterling, Swinburne University of Technology, Australia

Reviewed by:

Diego Muñoz, Swinburne University of Technology, Australia
Galena Pisoni, University of Nice Sophia Antipolis, France
Antonette Mendoza, The University of Melbourne, Australia

Copyright © 2021 Scarpa, Prilletensky, McMahon, Myers, Prilleltensky, Lee, Pfeiffer, Bateman and Brincks. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Michael P. Scarpa, mps162@miami.edu

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.