- 1School of Exercise and Nutrition Sciences, Deakin University, Geelong, VIC, Australia
- 2Health Educational Development Unit, Deakin University, Geelong, VIC, Australia
- 3Centre for Quality and Patient Safety Research in the Institute for Health Transformation, School of Nursing and Midwifery, Deakin University, Geelong, VIC, Australia
- 4Deakin Learning Futures, Office of the Deputy Vice Chancellor (Education), Deakin University, Geelong, VIC, Australia
Post-graduate programs attract older students, who often work part-time or full-time and have child-care responsibilities. In the Information Age, online learning environments can help these students meet their learning objectives more efficiently and provide a unique opportunity to address individual learning preferences. The aim of this study was to assess the learning experiences of postgraduate students in an online learning environment that delivered content in a guided, self-directed way focused on active learning opportunities. Two hundred and eighty-seven students participated in the study. A pragmatic descriptive design with purposive sampling was used to examine the impact of a newly developed active online learning environment on student commitment, performance and satisfaction, compared with a passive, pre-recorded lecture. In contrast to our hypothesis that all metrics would improve with subject redevelopment, student performance and commitment did not improve in the active online learning environment; however, student satisfaction increased significantly. These findings might be partly attributed to the increased cognitive load associated with online learning. This study demonstrates how, for postgraduate students choosing online learning, active learning experiences can provide a greater sense of satisfaction while accommodating the heterogeneity of the cohort and its different learning preferences. However, in a worldwide context where remote learning is expanding rapidly and urgently, it also shows that online learning needs to be carefully scaffolded to ensure deep learning, and that the impact of the transition to online learning on performance and commitment should be considered, especially for students with little experience of online study.
Introduction
In 2010, 19% of students studying at Australian universities were enrolled in fully online (12%) or multi-modal (partially online, 7%) programs (Australian Bureau of Statistics, 2012). By 2017, it was estimated that these numbers had grown to over 175,000 Australian students (14%) enrolled in a fully online course (University Rankings Australia, 2017). With online learning expanding rapidly and urgently worldwide in response to COVID-19, this trend will only accelerate in the coming years. Australian online post-graduate programs typically attract older students, who often work full-time or part-time and have child-care responsibilities within the home (Stoessel et al., 2015), replicating a local and international trend (Jancey and Burns, 2013). The emergence of web-based learning technologies has provided a unique opportunity for flexible access to learning. In particular, a growing number of students living in rural/remote areas enroll in post-graduate university degrees (Waschull, 2001; Greenland and Moore, 2014) thanks to ever-increasing internet access (Casey, 2008).
Web-based programs require more learning independence than traditional, highly structured on-campus programs. Self-directed learning is a key factor determining student success, and appropriate learning design is critical to engage and motivate students. With web-based learning delivery expanding globally (Casey, 2008), and now in the context of a pandemic, it has become increasingly important for tertiary educators to offer online delivery methods and learning experiences that address the social and cognitively active nature of online learning (Akcaoglu and Lee, 2016) and acknowledge students’ different learning preferences. This study, undertaken before the pandemic, evaluated the redevelopment of an online postgraduate subject offered by Deakin University, Nutritional Biochemistry and Physiology. As with many online courses, the cohort of students enrolled in this subject is characterized by a wide range of ages, academic backgrounds and professional commitments. Creating an online environment that meets diverse students’ learning needs is complex and challenging (Hill et al., 2009; Chita-Tegmark et al., 2011; Gillett-Swan, 2017). It was suggested as early as 2001, when web-based learning was in its infancy, that such an environment may require a unique approach to content delivery (Fetherston, 2001). Innovative online learning environments can help students to meet their learning objectives more effectively by offering content in multiple formats (written, auditory, interactive) along with providing learning opportunities that are less passive (Laurillard, 1998; Chita-Tegmark et al., 2011; Moorefield-Lang et al., 2016). While the effectiveness of online learning for student satisfaction, commitment and performance is an increasingly important research area in undergraduate education, there is a distinct lack of systematic evidence assessing post-graduate online learning environments.
The content of the Nutritional Biochemistry and Physiology subject is typically perceived as dry and challenging by students re-entering higher education, because it addresses the foundation knowledge in biochemistry and physiology needed for later applied nutrition topics. Students with little or no science background can also find studying biochemistry and physiology for the first time somewhat confronting. This subject was historically delivered as a series of audio-visual recordings of a 45 min narrated slide show, which received poor evaluations from students. These evaluations were partly attributed to the poor design of the online content delivery. In 2018, we implemented a series of changes to the learning design that encouraged engagement, social presence, communication, active learning, interactivity and formative feedback through a series of synchronous and asynchronous activities (Laurillard, 2013; Sun and Chen, 2015; Stone, 2016). These changes were aimed at delivering the subject content in a planned sequence, with multiple opportunities to engage with learning materials in varying formats, but without modifying the amount or type of content that was delivered. Active learning, whether online or in a classroom, requires students to cognitively engage with learning materials and activities specifically designed by the teacher (Bonwell et al., 1991). During active learning, students are required to think, analyze, synthesize, discuss with peers and make decisions, resulting in increased student engagement (Freeman et al., 2014). Our hypothesis was that student commitment, performance and evaluations for the subject would be higher than previously, and that the subject would also rate higher against experiences in concurrently enrolled subjects using the recorded lecture model.
The overarching aim of this study was to assess how a new learning design would influence post-graduate student commitment, performance and satisfaction in the online learning environment. A secondary aim was to assess how each learning activity was perceived by the students, and how it positively or negatively contributed to their learning experience. We hypothesized that delivering content in multiple formats would generally improve the students’ experience of the subject, increase their commitment and performance, and that students’ age and educational background may be important cofactors in this response.
Materials and Methods
Design
A pragmatic, descriptive design using a purposive sample was used to address the research aims. A pragmatic design is appropriate in this context as it allows a matter-of-fact approach to the research question (Cohen et al., 2013).
Sample
This study was approved by the Deakin University Human Ethics Advisory Group (HEAG-H 2018-057). Of the 288 students enrolled in the Deakin University online postgraduate subject Nutritional Biochemistry and Physiology in 2017 and 2018, 287 (134 from 2017, and 153 from 2018) provided informed consent to collect their anonymous data from the university web-based delivery system (99.5% response rate). Out of 153 post-graduate students enrolled in 2018, 75 also agreed to complete the survey (49% response rate).
Subject Redevelopment
For the 2018 academic offering of the subject, 21 core topics were redeveloped and delivered over 11 weeks. Instead of passively listening to an audio-visual recording of a 45 min narrated slide show as per the 2017 curriculum, students engaged with a new mode of delivery based on an HTML webpage hosted in the Brightspace by D2L learning management system (LMS). Based on best practice recommendations for online learning design (Puzziferro and Shelton, 2008; Laurillard, 2013; Sun and Chen, 2015; Stone, 2016), the learning experiences for each week were scaffolded with deliberate inclusion of educational material and specific communication strategies to actively engage students in the learning process. For each week, the webpage specifically included:
• Short video clips (<3 min) aimed at welcoming the students in a personal way, establishing a safe learning space, introducing each core topic and guiding the students through the self-directed learning activities by explaining their purpose and putting them into the context of the subject expectations. These clips helped to maximize social presence (Hostetter and Busch, 2012; Richardson and Swan, 2019) and frame expectations (Tharayil et al., 2018), both of which are essential to ensure student satisfaction and engagement (Trowler, 2010).
• One to three short narrations of key concepts (<10 min) along with supporting written content focusing on core ideas to ensure constructive alignment (Biggs, 2015).
• One to two videos from external providers and/or links to online interactive activities to ensure student active engagement (Puzziferro and Shelton, 2008; Laurillard, 2013; Sun and Chen, 2015; Stone, 2016).
• One to two links to contemporary readings (e.g., pieces published in The Conversation) (Kinash, 2019).
• Non-graded, self-assessment multiple-choice questionnaires (MCQs) that were available each time new concepts had been introduced, as well as at the end of each topic, to encourage learner feedback on knowledge acquired through the previous activities (Hattie and Timperley, 2007; Shute, 2008).
• Finally, students had the choice to attend five synchronous interactive 90 min seminars with the teacher online (via the BlackBoard Collaborate virtual classroom platform) or on-campus (for those living locally). For those students who could not engage synchronously, recordings of these interactive sessions were also offered. These seminars were interactive and an online polling service was used to provide students with the opportunity to participate and interact via their mobile phones, with results displayed to stimulate class discussion. The reasons for including polling were threefold: to allow students an opportunity to anonymously benchmark their progress against other students by answering practice MCQs (Gillett-Swan, 2017); to receive further feedback on their acquisition of key concepts and enable discussion of their responses with the teacher and their peers to deepen learning through peer teaching (Stigmar, 2016); and to access the teacher’s functioning knowledge.
Quantitative Data
Student Commitment
The Brightspace by D2L LMS has the capacity to provide data on student activity within the site. Student activity data were extracted for 2017 and 2018. All data were de-identified to protect privacy. For each student, the following variables were determined: (1) time (in minutes) spent in the resources section of the subject website, (2) total number of mouse clicks on topics within the resources section, (3) discussion posts read, expressed as a percentage of total number of discussion posts, and (4) discussion posts authored, also expressed as a percentage. These values were used as a proxy for student commitment prior to and after the redevelopment of the subject. An independent samples t-test was used to determine any significant differences between student commitment in 2017 and 2018.
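As an illustration of this between-cohort comparison, the sketch below shows how the four commitment proxies exported from the LMS could be tested with an independent samples t-test. The analysis reported here was run in SPSS; this Python version is only a minimal equivalent sketch, with hypothetical file and column names.

```python
# Minimal sketch of the 2017 vs 2018 commitment comparison described above.
# The published analysis was run in SPSS; this Python version is an
# illustrative equivalent using hypothetical file and column names.
import pandas as pd
from scipy import stats

activity = pd.read_csv("lms_activity_deidentified.csv")  # hypothetical LMS export

commitment_vars = [
    "minutes_in_resources",   # (1) time spent in the resources section
    "resource_clicks",        # (2) clicks on topics within the resources section
    "pct_posts_read",         # (3) discussion posts read (% of all posts)
    "pct_posts_authored",     # (4) discussion posts authored (% of all posts)
]

cohort_2017 = activity[activity["year"] == 2017]
cohort_2018 = activity[activity["year"] == 2018]

# Independent samples t-test for each commitment proxy, 2017 vs 2018
for var in commitment_vars:
    t, p = stats.ttest_ind(cohort_2017[var], cohort_2018[var], nan_policy="omit")
    print(f"{var}: t = {t:.2f}, p = {p:.3f}")  # significance threshold: p < 0.05
```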
Student Performance
Student performance for each of the assessment tasks (which remained the same from 2017 to 2018) was de-identified, weighted and averaged (final subject grade), and compared from 2017 to 2018 using an independent samples t-test. The relationship between student performance and each of the measures of commitment described above was analyzed using a Pearson’s correlation.
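A similar hedged sketch, again with hypothetical column names and with SPSS as the actual analysis tool, illustrates the between-cohort grade comparison and the grade-commitment correlations described above:

```python
# Minimal sketch of the performance analysis: compare the weighted, averaged
# final grade between cohorts, then correlate each commitment measure with
# the final grade. Column names are hypothetical.
import pandas as pd
from scipy import stats

data = pd.read_csv("lms_activity_deidentified.csv")  # same hypothetical export

# Independent samples t-test on the final subject grade, 2017 vs 2018
grades_2017 = data.loc[data["year"] == 2017, "final_grade"]
grades_2018 = data.loc[data["year"] == 2018, "final_grade"]
t, p = stats.ttest_ind(grades_2017, grades_2018, nan_policy="omit")
print(f"final grade: t = {t:.2f}, p = {p:.3f}")

# Pearson's correlation between each commitment proxy and the final grade,
# computed within each cohort
for year in (2017, 2018):
    cohort = data[data["year"] == year].dropna()
    for var in ["minutes_in_resources", "resource_clicks",
                "pct_posts_read", "pct_posts_authored"]:
        r, p = stats.pearsonr(cohort[var], cohort["final_grade"])
        print(f"{year} {var}: r = {r:.3f}, p = {p:.3f}")
```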
Student Satisfaction
Data were drawn from eVALUate, the Deakin University student evaluation system. Designed to measure student engagement and satisfaction with learning and the subject, eVALUate provides scores against 10 statements and an overall subject item, each rated on a 5-point scale anchored from strongly agree to strongly disagree (see Table 1). Student responses were used to determine and compare the level of student satisfaction between 2017 and 2018. Because these data are collected anonymously and provided in an aggregated form, no further analysis could be run.
Table 1. Student satisfaction percentage scores (%) against 11 eVALUate items for the students in 2017 (N = 134) and 2018 (N = 153).
Learning Experiences
Due to the study design, only the 2018 cohort was eligible to participate in an online survey that aimed to compare student satisfaction between the current online learning methods and the previous method, an audio-visual recording of a 45 min narrated slide show. Due to the structure of the course, each student was currently, or had been within the past 12 months, exposed to the 45 min narrated slide show format in another subject. Demographic data collected included gender, age, first language, educational background, current degree, estimated study time, and employment time. Students were asked to grade their current online learning experience compared with their previous or concurrent experiences of an audio-visual recording of a 45 min narrated slide show. Finally, students were asked to assign a satisfaction score (from 1 to 10, 10 being the highest) to each online learning activity in the revised program comprising scaffolded, sequenced and guided learning activities, including short narrations of key concepts, readings, videos, interactive pages, and self-assessment questions as described earlier.
Qualitative Data
For the qualitative part of this study, an extended-response questionnaire was used to explore student perceptions of elements of the new subject learning design. Students were asked to reflect on their personal learning experience in this subject, including their perceptions of the delivery mode, the requested study time commitment, the learning activities, and the format of the seminars. Questions were adapted and derived from previously published research (Currey et al., 2015). Examples of the questions asked are provided in Table 2. The survey was administered via the Qualtrics online platform during the final 3 weeks of the teaching trimester. The full survey can be found in the Supplementary Material.
Data Analysis
Demographic data were analyzed using descriptive statistics. Quantitative data were analyzed using descriptive statistics and parametric tests including paired t-tests, one-way ANOVA and Pearson’s correlations. Data are presented as mean ± SD. SPSS Statistics 24 was used to conduct all statistical tests, and the value for significance was set at p < 0.05.
Qualitative data were analyzed using thematic analysis techniques to gain a greater understanding of student perceptions of the benefits of the new mode of delivery of this subject. Thematic analysis, a method for identifying, analyzing and reporting themes or patterns within data, was conducted according to the principles stated by Braun and Clarke (2006). Briefly, these principles rely on a six-step analysis process: (1) familiarizing yourself with the data, (2) generating initial codes, (3) searching for themes, (4) reviewing themes, (5) defining and naming themes, and (6) producing the report. Constant comparative techniques were applied to analyze for overlap, redundancy, the emergence of any new themes, and relationships between themes.
Results
Student Performance and Commitment in 2017 and 2018
Student overall performance (final subject grade) did not differ between 2017 (uploaded classroom recordings; subject grade = 68.14 ± 13.01) and 2018 (self-directed, guided learning; subject grade = 66.47 ± 12.79) (p > 0.05). Student commitment was assessed as a combination of time spent in the resources section, total number of clicks in the resources section, percentage of discussion posts read and percentage of discussion posts authored. None of these parameters differed significantly between 2017 and 2018 (all p > 0.05), suggesting that students’ commitment was not influenced by the new format of the subject (Table 3). In both 2017 and 2018, the best predictor of overall student performance was involvement in the discussion board. The number of discussion posts read was weakly but significantly correlated with overall performance (Pearson’s r(132) = 0.268, p < 0.01 in 2017 and r(151) = 0.197, p < 0.05 in 2018). Similarly, the number of discussion posts authored was weakly but significantly correlated with overall performance (r(132) = 0.261, p < 0.01 in 2017 and r(151) = 0.175, p < 0.05 in 2018).
Student Satisfaction in 2017 and 2018
Student responses to the eVALUate items indicate that satisfaction was higher in 2018 than in 2017. The largest improvements in student satisfaction from 2017 to 2018 were observed for item 11 “overall subject satisfaction” (increase from 53 to 83%), item 6 “workload” (increase from 55 to 86%) and item 7 “teaching quality” (increase from 42 to 83%). Item 2 “learning experiences” (increase from 61 to 86%) and item 3 “learning resources” (increase from 76 to 86%) were identified as the items most likely to reflect changes in the content delivery mode. Data for each of the eVALUate items are shown in Table 1.
Demographics of the 2018 Student Cohort
Out of 153 post-graduate students enrolled in 2018, 75 also agreed to complete the online survey (response rate: 49%). Of the students, 84% were female. The most represented age cohort was 25–34, and the students’ first language was predominantly English. About one third of the students had an undergraduate degree in biomedical, health or exercise sciences, and about a quarter of them had an undergraduate degree in nutrition or food sciences. Before enrolling into their current postgraduate degree, 70% of students had last studied at university between 2011 and 2017. In contrast, 12% had not studied at university since 2000 or earlier, or at all. Twenty-three percent of the students did not do any paid work, or were retired, whilst 27% of them worked full-time (35 h per week or more). These data can be seen in Supplementary Table 1.
Subject Enrollment Status and Self-Reported Study Commitment
Students were mostly enrolled in Master degrees (60%) (see Supplementary Table 2). Most students reported that they committed up to 9 (37%) or between 10 and 19 (31%) hours of work per week to this specific subject. Students predominantly attended the online (47%) or on-campus seminars (27%).
Subject Delivery Mode
For all demographic groups, the new delivery mode was preferred to an audio-visual recording of a 45 min narrated slide show (p = 0.001). Gender, first language, educational background, current degree, and professional commitment did not influence the way the students perceived any component of the subject. The self-reported time spent on the subject website weekly was negatively correlated with the overall score given to the subject (p = 0.048), suggesting that students allocating less study time to this subject may find more benefits in the new format. This idea of flexible access to learning material and time efficiency was consistent across the qualitative responses. For example, students reported “I really appreciate I can watch it when I am ready and have the time to watch” or “I have found this format much more effective. It allows me to focus in smaller concentrated bursts on the material, as time allows me.” This was specifically emphasized by students working full time: “Because I work full time I find the flexible delivery much more user friendly as it allows me to study from home around my current commitments.” While the concept of flexible learning is intrinsic to online programs, offering content in small packages of different formats seemed even more beneficial to those having less study time to allocate to this subject.
Learning Activities
In terms of meeting the subject learning outcomes, readings were perceived as the least effective activity (mean satisfaction score = 7.2/10), which still indicates a moderately high satisfaction level. The short narrations of key concepts scored 7.4/10, closely followed by the interactive links to external videos and websites at 7.5/10. Self-assessment questionnaires were the most highly rated, with a mean of 7.9/10.
Readings
The readings were perceived as effective for learning by most students (mean satisfaction score = 7.2/10), as reflected in 43 positive responses compared to only 10 negative responses. Students found that readings helped to consolidate their learning and sometimes provided a more detailed explanation of a topic. Students appreciated the ability to read at their own pace and reinforce important messages from the key concepts. For example, students reported that the readings were “particularly useful, they help me to consolidate my understanding and to go at my own pace” and that they “effectively reinforce the material as well as giving access to revision or more elaborate explanations.” Students who found the readings ineffective for their learning typically identified themselves as visual learners, preferring to “watch the videos or interact with others.”
Short Narrations Delivering Concepts
The short narrations delivering key concepts (mean satisfaction score = 7.4/10) were scored increasingly highly with increasing student age (p = 0.029). This effect was also reflected by a negative correlation between the score given to this particular activity and the last time the student had studied at university (p = 0.049). Students found the short narrations to be effective in maintaining their concentration and focus compared with standard 45 min lectures. This was reflected by 49 positive responses indicating that short narrations were effective for learning, compared to only 13 negative responses. Students regularly commented that these narrations were “concise, relevant, well-structured” and “extremely effective because they are quick and to the point, making it easy to retain the information and easy to find the time to watch them.”
Interactive Activities and External Video Links
Interactive activities and external video links were the second-most preferred activity for learning (mean satisfaction score = 7.5/10). Forty-eight responses indicated that the interactive activities or videos were effective for learning, compared to only seven responses suggesting these activities were ineffective. Many students perceived that the interactive activities and videos provided a good supplement to their learning by giving them a different perspective on the subject content. One student commented that the interactive activities and videos “put what we are learning into perspective. I am a very visual learner and this helped me grasp most concepts a lot more and have a lot more of an understanding.” Students acknowledged that learning content from a different viewpoint (than what is presented by the lecturer) can help them to better incorporate key ideas into their professional practice. In contrast, some students felt frustrated that they were directed to external videos and activities to achieve learning outcomes rather than being taught by academic staff. However, students consistently reflected that the interactive activities and videos made studying “easy and fun,” supporting the idea that students value enjoyment in learning and may better retain information when employing active learning, in comparison to passive learning.
Self-Assessment Questions
The self-assessment questions aimed at providing formative feedback were judged the most useful activity overall (mean satisfaction score = 7.9/10). The more recently students had attended university for previous study, the less highly they graded the opportunity for formative feedback in this subject (p = 0.042). Fifty-four student responses for the self-assessment questions were coded as effective for learning, compared to only seven responses that were coded as ineffective for learning. Students consistently explained that these activities provided feedback about whether they had understood a topic concept, with one student reporting “I find these to be extremely helpful. They’re a great way to self-assess and to ensure you not only know the content but understand how it applies to different concepts.” Students were enthusiastic about receiving feedback that could provide an indication of their progress and highlight where they misunderstood concepts or required further revision. Several students also suggested that the self-assessment questions “helped [them] gain confidence” before undertaking graded assessments.
Seminars
Three quarters of the students (74%) responding to the survey reported attending the synchronous seminars (face to face or remotely), while the remainder elected to listen to the recordings asynchronously (21%) or did not engage with the seminars at all (5%). Students reported that the seminars were helpful for their learning, with 48 positive and 18 negative responses coded. Students felt the ability to engage with teaching staff and other students was a positive learning experience. For example, students reported that the seminars “helped to feel included with the cohort and hear others discuss aspects of the content.” Students valued the personal interaction and real-time feedback that they could get from academic staff. Negative responses commonly described that the seminars were “quite long” or that “more detail was needed” in the information delivered. It may therefore be important to outline the aims of each learning activity to students to ensure that their expectations are met. Live polling was a key mode of learning implemented in the seminars, allowing students to anonymously respond to online questions using their mobile phones, with results displayed to stimulate class discussion. Thirty-two responses were coded as perceiving the live polling as effective and engaging, whereas 18 responses suggested that it was not an effective or engaging mode of learning. Students who disliked the live polling were commonly those who did not attend the seminar and instead listened to a recording of it. For example, students reported, “I find that listening to the seminars after they’ve happened difficult as they’re often interactive and much of the time is spent waiting for other students to respond/engage” or “when I watch the recordings, there have been times where I have not understood the answer yet I can’t do anything about it.” In contrast, students who attended the seminars were more engaged by the live polling and reported that “the live questionnaires are more engaging than a standard seminar as it is fun and we are more focused” and “I liked these as it provided a chance to test knowledge without pressure and then talk through the answers. This really helped my learning.” In sum, those who engaged during the real-time, synchronous opportunities reported having gained more from their learning than those who passively watched the recordings afterward.
Discussion
In this study, we evaluated the comprehensive redevelopment of the online learning environment of a post-graduate subject nested within Deakin University’s post-graduate Human Nutrition course. Students who enroll in this course complete an online, non-vocational pathway at the Graduate Certificate, Graduate Diploma or Masters level. The main finding of this study was that an interactive and guided, self-directed delivery of the learning content was perceived as more effective in meeting the learning outcomes than uploaded, pre-recorded lectures. While this overall positive response was supported by student satisfaction metrics, students did not perform better in terms of final grade, nor did the results suggest that they committed more time to their studies.
As part of broader university modeling, course students were surveyed in 2018 about their study intentions. Students reported pursuing post-graduate studies out of personal interest (27%), but also because they aspired to a career change (24%) or believed the course would advance their career (12%) (Deakin University Marketing Division, 2018). Of importance, an overwhelming majority of students (86%) chose to enroll in this course because of the online delivery mode (Deakin University Marketing Division, 2018). Yet, in contrast to our hypothesis, students did not perform better nor commit more time to their studies in the new online learning environment. This finding should be discussed in light of the recent results reported by Deslauriers et al. (2019). When comparing the perception of learning in response to the same content being taught using passive or active instruction, these authors found that students’ perception of learning was poorer in the active learning setting. Deslauriers et al. (2019) attributed this finding to the increased cognitive load that is inherent to active learning and suggested that it may impair student motivation, commitment, and willingness to further engage in their learning. Our student cohort, especially the older students, had primarily experienced passive learning during their earlier studies. Due to our study design, it was not possible to directly compare the perception of passive and active delivery of the same content. It can, however, be postulated that students who are new to active learning might interpret this increased cognitive effort as a sign that their learning is not effective, which may discourage them from wanting to learn more and invest more time in their studies. Taken together, our results suggest that the active format conveyed the content in a way that was as effective as, but not more effective than, in the past. On the other hand, we found that the new online learning environment was significantly more enjoyable, motivating, and perceived as more time efficient. This highlights the importance of preparing students for active learning and of providing a clear and smooth flow of explanations allowing students to easily navigate the learning design and content (Deslauriers et al., 2019). In these conditions, active and interactive learning experiences can be used to provide postgraduate students choosing online learning with a greater sense of satisfaction.
In terms of individual learning activities, we observed a clear gradation from active/interactive activities (self-assessment questions, interactive activities, and external video links), which were preferred by the students, to passive activities (short narrations delivering key concepts and readings), which were found less effective. This confirms that in a context where students are mostly isolated and do not get many opportunities to apply their knowledge, activities involving some level of active engagement should be promoted (Prince, 2004) despite the increased cognitive load associated with active learning (Deslauriers et al., 2019). Using multiple learning activities presented in alternative formats improves the online learning experience (Chita-Tegmark et al., 2011; Moorefield-Lang et al., 2016; Fidaldo and Thormann, 2017) and provides the opportunity for students to focus on the content resources that best suit their preferred style of learning. Structuring content delivery modes to suit all learning preferences may enhance student engagement and retention of knowledge (Hawk and Shah, 2007). As nicely summarized by one of our participants: “I like to listen plus read. The more senses [I can use] the better.” Figure 1 summarizes the perceived advantages and disadvantages of the different learning activities in the context of our study.
Figure 1. Summary of findings. Advantages and disadvantages of learning activities in an online higher education subject.
Social isolation, absence of peer interaction, lack of social cues and lack of benchmarks are issues that are common to online programs (Croft et al., 2010; Gillett-Swan, 2017). Some of the improvements implemented in this subject may have started to address these issues. The live polling tool used in the seminars gave the students an opportunity to benchmark themselves against other students and to participate in the discussion regardless of their individual outcome. As such, this teaching strategy may not only have reinforced students’ self-confidence, but also made the seminars more engaging and triggered interaction with other students. Even in an anonymous format, students found that active participation in live polling gave them a “sense of belonging and contributing,” which they may otherwise not have experienced in a passive learning environment. Being able to create a sense of community where peers provide constructive feedback in a supported social context is recognized as one of the most significant challenges associated with implementing successful online courses (Desai et al., 2008). Our findings, as well as other studies, confirm that students value social exchanges more than any other aspect of their online courses (Boling et al., 2012), and that peer learning should be prioritized when possible (Broadbent and Poon, 2015). This corroborates the idea that collaborating and learning from peers is integral to students’ learning and performance. The anonymous nature of the poll questions, followed by an open discussion, may also have helped to overcome some of the personal barriers that online students regularly report experiencing in collaborative learning tasks, including anxiety, insecurity and feeling out of their comfort zone (Roberts and McInnerney, 2007; Hill et al., 2009; Gillett-Swan, 2017). Indeed, students appreciated the ability to engage with the class in an anonymous format, whereby any incorrect answers they provided were not met with embarrassment. Not surprisingly, the students who did not find any benefit in the live polling were those who did not attend the online or on-campus seminars but rather listened to the recordings later on. This constitutes a known limitation of these types of activities (Stoessel et al., 2015; Gillett-Swan, 2017), and despite several synchronous seminar times being offered during the week, personal factors including caring for children or professional commitments can consistently impact online postgraduate students’ ability to access and participate in live sessions (Stoessel et al., 2015).
Age, which also correlated positively with the time elapsed since students had last attended university, was the main variable explaining students’ appraisal of the different activities, whereas, with respect to the second part of our hypothesis, individual academic background did not account for any of the variability. Students who had only experienced face-to-face lectures before typically found more value in the 10-min narrations of key concepts and often asked for more, or for a reinforced presence of the lecturer, an outcome that can be achieved in several ways (Lehman and Conceição, 2010). Previous research, however, suggests that students’ attention span in lectures lasts approximately 15 min (Prince, 2004), after which focus and retention of information decline. Through various references to being time poor, some students highlighted that short narrations enabled them to find the time to address the learning outcomes. Still, others found short narrations less useful due to the lack of depth in learning. These notions highlight the importance of reiterating to students that the key learning outcomes cannot be fully attained with a 10 min narration alone and that engaging with the other activities will allow them to supplement their learning and engage in greater depth of learning where required. Consistent with good online learning design, it was expected that students would engage in multiple ways with various materials (Laurillard, 2013; Stone, 2016). The older age cohort also relied more strongly on the self-assessment questions, especially as a way to gain confidence before undertaking graded assessments. This suggests that the lack of recent experience at university may generate a lack of confidence about current expectations and standards, but also highlights the value placed on formative feedback. Non-experienced students are also poor at assessing their own learning, especially in the non-traditional context of online learning (Deslauriers et al., 2019). This concern can be partly addressed by clear explanations about what is expected of students in the subject and what to expect from the teacher, along with providing students with early and frequent opportunities to test their knowledge prior to undertaking graded assessment tasks. Both individualized and anonymous forms of feedback are valuable for maximizing the benefit of formative feedback on the learning acquired thus far. Increasing students’ self-esteem is particularly important for students enrolled in online subjects who have limited interaction with other students for support and feedback (Gillett-Swan, 2017), and our results suggest that the form of feedback we used was appropriate and constructively aligned (Hattie and Timperley, 2007; Biggs, 2015).
Limitations of this study include that, in contrast to the work by Deslauriers et al. (2019), for example, it was not possible to qualitatively compare the perception of passive and active delivery of the same content. Instead, we used another subject concurrently undertaken by the students and presented as a 45 min narrated slide show as a surrogate for an experience of passive delivery. In contrast, a strength was that we were able to directly compare metrics of students’ commitment, satisfaction and performance because neither the learning content nor the assessment tasks changed from the old to the new version of the subject. Because student satisfaction data are collected anonymously and provided in an aggregated form, we could not analyze the association between student satisfaction and student commitment and performance across the 2017 and 2018 versions of the subject. The same limitation applies to our qualitative data, where no linkage was possible between satisfaction indices and appraisal of the different activities and of the subject in general. Paired analyses would allow us to start teasing out the mechanisms responsible for a general increase in satisfaction that was not paralleled by a general increase in commitment or performance and, where possible, are recommended in future research.
Implications for future research stem from our main observation that post-graduate student satisfaction, enjoyment, and sense of inclusion can be improved by redeveloping online subjects into an active form that demands cognitive engagement, but that this may not be enough to improve their performance and commitment. Further research opportunities abound for developing a stronger evidence base for best practice in online learning design and delivery that combines cognitive, emotional and sensory engagement. In the current context of a world pandemic, online learning is becoming the new normal, and many universities may only rarely offer in-person teaching in the coming years. Educators have been placed under unprecedented pressure to quickly design and develop new online learning experiences. In contrast to the conditions of emergency remote learning implemented during COVID-19, our study, conducted before the pandemic, was carefully planned, evidence-based and appropriately resourced. Online education design based on best practice was, however, not enough to improve students’ results or increase the time invested in their studies, a result that might be attributed to the increased cognitive load associated with online learning. Further research is required to understand the optimal forms and processes by which technical and learning design teams can support content-expert teachers when designing online learning experiences. While the advantages of online learning are multiple, valuable online learning experiences require more than the provision of available material and efficient access. Creating a sense of social connectedness to foster engagement and ensure deep learning can be achieved through active and collaborative learning. However, even when aligning with best practice, course designers should carefully consider the cognitive load associated with active learning, and be clear and explicit about their learning design, as this load may negatively impact student results or commitment levels. This is especially true in the post-graduate context, where students are traditionally older and more likely to have limited experience of online learning.
Conclusion
In conclusion, redeveloping an online subject from an uploaded, classroom-based recorded lecture to an interactive and guided, self-directed mode of delivery presents multiple advantages in terms of student satisfaction, motivation and enjoyment, without impacting student results or commitment levels. The latter might be attributed to the increased cognitive load associated with active learning online. A more active and interactive mode of delivery provides students with a greater sense of inclusion while accommodating the diversity of the cohort and its different learning preferences.
Data Availability Statement
All datasets generated for this study are included in the article/Supplementary Material; further inquiries can be directed to the corresponding author.
Ethics Statement
The studies involving human participants were reviewed and approved by the Deakin University Human Ethics Advisory Group (HEAG-H 2018-057). Written informed consent for participation was not required for this study in accordance with the national legislation and the institutional requirements.
Author Contributions
SL, IS, and JC designed the study. SL and OK collected the data. OK, AH, and SL processed and analyzed the data. SL, OK, and JC wrote the manuscript. All authors edited the manuscript and proof-read the final version.
Conflict of Interest
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Supplementary Material
The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/feduc.2020.598560/full#supplementary-material
Supplementary Table 1 | Demographic Data of the 2018 student cohort (N = 75).
Supplementary Table 2 | Subject Enrollment and Study Commitment of the 2018 student cohort (N = 75).
References
Akcaoglu, M., and Lee, E. (2016). Increasing social presence in online learning through small group discussions. Int. Rev. Res. Open Distrib. Learn. 17, 1–17. doi: 10.19173/irrodl.v17i3.2293
Australian Bureau of Statistics (2012). Higher Education. Available online at: http://www.abs.gov.au/ausstats/abs@.nsf/Lookup/by%20Subject/1301.0~2012~Main%20Features~Higher%20education~107 (accessed May 24, 2012).
Biggs, J. T. C. (2015). “Constructive alignment: an outcomes-based approach to teaching anatomy,” in Teaching Anatomy, eds W. Pawlina and L. K. Chan (Cham: Springer). doi: 10.1007/978-3-319-08930-0_4
Boling, E. C., Hough, M., Krinsky, H., Saleem, H., and Stevens, M. (2012). Cutting the distance in distance education: perspectives on what promotes positive, online learning experiences. Int. Higher Educ. 15, 118–126. doi: 10.1016/j.iheduc.2011.11.006
Bonwell, C. C., Eison, J. A., and Aehe Staff. (1991). Active Learning: Creating Excitement in the Classroom. New York, NY: Wiley.
Braun, V., and Clarke, V. (2006). Using thematic analysis in psychology. Qual. Res. Psychol. 3, 77–101. doi: 10.1191/1478088706qp063oa
Broadbent, J., and Poon, W. (2015). Self-regulated learning strategies & academic achievement in online higher education learning environments: a systematic review. Int. Higher Educ. 27, 1–13. doi: 10.1016/j.iheduc.2015.04.007
Casey, D. M. (2008). The historical development of distance education through technology. TechTrends 52, 45. doi: 10.1007/s11528-008-0135-z
Chita-Tegmark, M., Gravel, J. W., Serpa, B., Domings, Y., and Rose, D. H. (2011). Using the universal design for learning framework to support culturally diverse learners. J. Educ. 192, 17–22. doi: 10.1177/002205741219200104
Cohen, L., Manion, L., and Morrison, K. (2013). Research Methods in Education. Milton Park: Taylor & Francis. doi: 10.4324/9780203720967
Croft, N., Dalton, A., and Grant, M. (2010). Overcoming isolation in distance learning: building a learning community through time and space. J. Educ. Built Environ. 5, 27–64. doi: 10.11120/jebe.2010.05010027
Currey, J., Eustace, P., Oldland, E., Glanville, D., and Story, I. (2015). Developing professional attributes in critical care nurses using Team-Based Learning. Nurse Educ. Pract. 15, 232–238. doi: 10.1016/j.nepr.2015.01.011
Desai, M. S., Hart, J., and Richards, T. C. (2008). E-learning: paradigm shift in education. Education 129, 327–334.
Deslauriers, L., McCarty, L. S., Miller, K., Callaghan, K., and Kestin, G. (2019). Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom. Proc. Natl. Acad. Sci. U.S.A. 116, 19251–19257. doi: 10.1073/pnas.1821936116
Fetherston, A. (2001). Pedagogical Challenges for the World Wide Web, Vol. 9, Greenville, NC: ECU Publications.
Fidaldo, P., and Thormann, J. (2017). Reaching students in online courses using alternative formats. Int. Rev. Res. Open Distrib. Learn. 18, 139–161. doi: 10.19173/irrodl.v18i2.2601
Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., et al. (2014). Active learning increases student performance in science, engineering, and mathematics. Proc. Natl. Acad. Sci. U.S.A. 111, 8410–8415. doi: 10.1073/pnas.1319030111
Gillett-Swan, J. (2017). The challenges of online learning: supporting and engaging the isolated learner. J. Learn. Des. 10, 20–30. doi: 10.5204/jld.v9i3.293
Greenland, S., and Moore, C. (2014). Patterns of online student enrolment and attrition in Australian open access online education: a preliminary case study. Open Praxis 6, 45–54. doi: 10.5944/openpraxis.6.1.95
Hattie, J., and Timperley, H. (2007). The power of feedback. Rev. Educ. Res. 77, 81–112. doi: 10.3102/003465430298487
Hawk, T. F., and Shah, A. J. (2007). Using learning style instruments to enhance student learning. Decision Sci. J. Innov. Educ. 5, 1–19. doi: 10.1111/j.1540-4609.2007.00125.x
Hill, J. R., Song, L., and West, R. E. (2009). Social learning theory and web-based learning environments: a review of research and discussion of implications. Am. J. Distance Educ. 23, 88–103. doi: 10.1080/08923640902857713
Hostetter, C., and Busch, M. (2012). Measuring up online: the relationship between social presence and student learning satisfaction. J. Scholarsh. Teach. Learn. 6, 1–12.
Jancey, J., and Burns, S. (2013). Institutional factors and the postgraduate student experience. Qual. Assur. Educ. 21, 311–322. doi: 10.1108/qae-nov-2011-0069
Kinash, S. (2019). How to Design & Deliver Quality Online Education. Available online at: https://educationtechnologysolutions.com/2019/06/how-to-design-deliver-quality-online-education/ (accessed July 7, 2020).
Laurillard, D. (1998). Multimedia and the learner’s experience of narrative. Comput. Educ. 31, 229–242. doi: 10.1016/S0360-1315(98)00041-4
Laurillard, D. (2013). Teaching as a Design Science: Building Pedagogical Patterns for Learning and Technology. Milton Park: Taylor & Francis. doi: 10.4324/9780203125083
Lehman, R. M., and Conceição, S. C. (2010). Creating a Sense of Presence in Online Teaching: How to “Be There” for Distance Learners, Vol. 18. Hoboken, NJ: John Wiley & Sons.
Deakin University Marketing Division (2018). Master of Human Nutrition Student Journey. Melbourne, VIC: Deakin University.
Moorefield-Lang, H., Copeland, C. A., and Haynes, A. (2016). Accessing abilities: creating innovative accessible online learning environments and putting quality into practice. Educ. Inf. 32, 27–33. doi: 10.3233/Efi-150966
Prince, M. (2004). Does active learning work? A review of the research. J. Eng. Educ. 93, 223–231. doi: 10.1002/j.2168-9830.2004.tb00809.x
Puzziferro, M. (2008). Online technologies self-efficacy and self-regulated learning as predictors of final grade and satisfaction in college-level online courses. Am. J. Distance Educ. 22, 72–89. doi: 10.1080/08923640802039024
Puzziferro, M., and Shelton, K. (2008). A model for developing high-quality online courses: integrating a systems approach with learning theory. J. Asynchronous Learn. Netw. 12, 119–136. doi: 10.24059/olj.v12i3.58
Richardson, J. C., and Swan, K. (2019). Examining social presence in online courses in relation to students’ perceived learning and satisfaction. J. Asynchronous Learn. Netw. 7, 68–88. doi: 10.24059/olj.v7i1.1864
Roberts, T. S., and McInnerney, J. M. (2007). Seven problems of online group learning (and their solutions). Educ. Technol. Soc. 10, 257–268.
Shute, V. J. (2008). Focus on formative feedback. Rev. Educ. Res. 78, 153–189. doi: 10.3102/0034654307313795
Stigmar, M. (2016). Peer-to-peer teaching in higher education: a critical literature review. Mentor. Tutoring Partnersh. Learn. 24, 124–136. doi: 10.1080/13611267.2016.1178963
Stoessel, K., Ihme, T. A., Barbarino, M.-L., Fisseler, B., and Stürmer, S. (2015). Sociodemographic diversity and distance education: who drops out from academic programs and why? Res. Higher Educ. 56, 228–246. doi: 10.1007/s11162-014-9343-x
Sun, A., and Chen, X. (2015). Online education and its effective practice: a research review. J. Inf. Technol. Educ. Res. 15, 157–190. doi: 10.28945/3502
Tharayil, S., Borrego, M., Prince, M., Nguyen, K. A., Shekhar, P., Finelli, C. J., et al. (2018). Strategies to mitigate student resistance to active learning. Int. J. STEM Educ. 5:7. doi: 10.1186/s40594-018-0102-y
University Rankings Australia. (2017). Number of Students Studying Online. Available online at: http://www.universityrankings.com.au/distance-education-numbers.html (accessed July 7, 2020).
Keywords: online learning, online education, post-graduate learning, active learning, subject redevelopment
Citation: Lamon S, Knowles O, Hendy A, Story I and Currey J (2020) Active Learning to Improve Student Learning Experiences in an Online Postgraduate Course. Front. Educ. 5:598560. doi: 10.3389/feduc.2020.598560
Received: 25 August 2020; Accepted: 13 October 2020;
Published: 03 November 2020.
Edited by:
Canan Blake, University College London, United Kingdom
Reviewed by:
Ana Remesal, University of Barcelona, Spain
Jon Mason, Charles Darwin University, Australia
Copyright © 2020 Lamon, Knowles, Hendy, Story and Currey. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Séverine Lamon, severine.lamon@deakin.edu.au