
MINI REVIEW article

Front. Educ., 25 January 2022
Sec. Assessment, Testing and Applied Measurement
This article is part of the Research Topic Online Assessment for Humans: Advancements, Challenges and Futures for Digital Assessment

Applying Learning Analytics in Online Environments: Measuring Learners’ Engagement Unobtrusively

Sima Caspari-Sadeghi*

  • Empirical Learning Sciences, University of Passau, Passau, Germany

Prior to the emergence of Big Data and technologies such as Learning Analytics (LA), classroom research focused mainly on measuring the learning outcomes of small samples through tests. Research on online environments shows that learners’ engagement is a critical precondition for successful learning, and that lack of engagement is associated with failure and dropout. LA helps instructors to track, measure and visualize students’ online behavior and to use such digital traces to improve instruction and provide individualized support, e.g., feedback. This paper examines 1) metrics or indicators of learners’ engagement as extracted and displayed by LA, 2) their relationship with academic achievement and performance, and 3) some freely available LA tools for instructors and their usability. The paper concludes by making recommendations for practice and further research, considering the challenges associated with using LA in classrooms.

Introduction

Data-driven Decision-Making (DDDM) requires instructors to actively adapt their instruction to evidence gathered in multiple formats from multiple sources. Digital environments such as LMS and MOOCs generate a large amount of data (Big Data) about students’ learning, performance and engagement activities, called ‘digital traces’. Since students’ engagement is reported to be positively associated with learning and productivity, Learning Analytics (LA) can be used to collect and analyze these digital traces in order to track, measure and visualize students’ engagement patterns with materials/content, instructors, and peers in real time (during the learning process). Teachers can use data from LA to guide their pedagogical decisions, improve instruction and provide customized support, e.g., individualized feedback. This paper explores the complex, multidimensional nature of engagement as well as the use of LA in quantifying and measuring the relationship between engagement and learning outcomes. Furthermore, some freely available LA tools, their usability and their challenges are presented. After a review of engagement conceptualizations, the remainder of the paper addresses the following questions:

Q.1. What counts as engagement in online learning environments?

Q.2. How is online engagement related to achievement and academic performance?

Q.3. Which LA dashboards are freely available to instructors?

Q.4. What are the challenges in using LA dashboards?

This review is neither exhaustive nor conclusive. The author adopted the integrative, five-phase review method suggested by Whittemore and Knafl (2005): 1) inclusion criteria, 2) problem identification, 3) literature search, 4) data evaluation, and 5) data analysis and presentation. As the first step, articles on “student engagement,” “learning analytics,” “conventional and online courses,” and “higher education” were considered eligible candidates for inclusion. The main identified problem was the “inadequacy of traditional evaluation methods in measuring student engagement.” The literature search was first performed on three journals related to education and educational technology: Educational Technology & Society, Journal of Educational Research, and British Journal of Educational Technology. Cross-referencing was then conducted: the reference sections of relevant articles were consulted to find further articles. About 84 articles were selected and given a quick review (abstracts only). Eventually, a total of 37 articles were read thoroughly.

Students’ Engagement: A Multidimensional Construct

Over the past 30 years, educational research has shown a growing interest in understanding “engagement.” Engagement, as a form of involvement and commitment, is defined as the “effort, time and energy students invest in academic experience and educational activities linked to desired learning outcomes” (Astin, 1984; Kuh, 2009). Engagement is considered the “holy grail” of learning (Sinatra et al., 2015), playing a critical role in the development of deep and effective self-regulated learning. While early research was mainly concerned with reducing dropout (Finn et al., 1995), there has been a shift towards understanding and describing engagement as a “process” and its relation to achievement. Several models or frameworks aim to conceptualize engagement. Earlier studies (Marks, 2000) conceptualized student engagement as a unidimensional construct that can be measured by observable behaviors, e.g., attendance, participation, or psychological effort. However, current literature favors a multidimensional conceptualization of engagement; e.g., Reschly and Christenson (2012) defined engagement, both as a product and a process, with four dimensions: academic, behavioral, affective, and social. Fredricks et al. (2004) suggested a three-component framework: behavioral, cognitive, and emotional engagement. Behavioral engagement refers to active physical participation, e.g., attending classes, completing the course, note-taking or asking questions. Cognitive engagement describes mental involvement in understanding, willingness to master a difficult task, and task-management approaches such as focused attention and problem-solving strategies. Emotional engagement includes affective factors, e.g., being interested in a topic, enjoying the learning experience, flow, and a sense of belonging to a community.

Considering the positive relationship between engagement and learning outcomes (e.g., Morris et al. (2005) reported that engagement can account for 31% of the variance in gain scores), several studies consider engagement a viable predictor of academic achievement (Lee, 2014). Traditionally, engagement was measured through 1) self-report questionnaires, 2) interviews, 3) observational checklists and 4) teachers’ informal observation of gestures and non-verbal features (e.g., boredom or frustration). There are several validated, empirical surveys to capture students’ engagement, e.g., the Classroom Survey of Student Engagement (Ouimet and Smallwood, 2005) or the Student Course Engagement Questionnaire (Handelsman et al., 2005). The majority of survey studies show that when students interact more frequently with instructors, content, and peers, they report higher satisfaction, more meaningful participation and higher-order learning (Eom et al., 2006). However, survey data often suffer from validity issues, e.g., social desirability, unstable moods, and response biases. Collecting interview data is also excessively costly, time-consuming and impractical in large courses. Although observational checklists, mostly completed by an external observer, contain some objective measures of engagement (e.g., student-initiated questions), they demand a great deal of skill, effort and time from observers. Lastly, teachers’ informal observation, although an important source of data for decisions made “on the fly,” is criticized for its unreliability and the subjectivity of judgement involved.

Students’ Engagement in Online Environments

Advances in technology paved the way for the automatic generation, collection and analysis of educational data. The advent of digital or virtual learning environments (VLE), such as Learning Management Systems (e.g., Moodle or Blackboard), Massive Open Online Courses (MOOCs), Open Course Ware (OCW), learning applications, Intelligent Tutoring Systems (ITS), e-book portals, online games, simulations, social networks (e.g., YouTube, Twitter) and Open Educational Resources (OER), as well as wearables, sensors and the Internet of Things (IoT), not only provides users with advantages such as openness, flexibility, autonomy, accessibility, and collaboration, but has also led to the exponential growth of Big Data: large, heterogeneous, multi-modal, multi-source educational data. Digital environments and Big Data generate (new forms of) evidence, so-called “digital traces,” about students’ learning, performance and engagement activities. These “digital traces” that students leave behind as they are involved in the learning process are analyzed with technologies such as Educational Data Mining (EDM) and Learning Analytics (LA).

Siemens (2012) defined LA as the “measurement, collection, analysis and reporting of data about learners and their contexts for purposes of understanding and optimizing learning and the environment in which it occurs.” LA’s use of extracted user data on actual learning activities in real time provides insight into engagement, motivation, performance and learning in an ‘unobtrusive and ubiquitous’ way (Bodily and Verbert, 2017). Such trace data are better indicators of learners’ engagement than self-report or observational data. Earlier uses of LA-generated engagement data aimed at predicting dropout and failure through early-alert systems (e.g., the StepUp dashboard). Currently, the focus is more on providing just-in-time, formative feedback to teachers (and students), so that they can adjust the content and personalize the delivery mode of instruction.

Engagement Metrics and Achievement

Several studies have discussed the question “what should count as desirable metrics of engagement and participation in VLE?” (DeBoer et al., 2014). Existing literature shows various operationalizations of online engagement. For example, Kovanović et al. (2019) examined different aspects of students’ engagement, e.g., course access, course navigation, number of recorded lectures watched, discussion, or completed assignments. Kahan et al. (2017) focused on the “frequency of involvement”: how many times video lectures are downloaded, embedded questions answered, exams/quizzes submitted, threads viewed/opened, posts/comments written in the discussion forums, etc. Below, we summarize some engagement indicators reported by existing literature (Aluja-Banet et al., 2019; Deng and Benckendorff, 2017; Deng et al., 2020; Joksimović et al., 2018; Lee et al., 2021); a minimal sketch of how a few of them can be derived from raw logs follows the list. It should be noted that this list is not exhaustive and, although useful, some metrics are difficult to interpret: e.g., is an unfinished task or exercise evidence of low engagement or of low cognitive skill?

• log data

• frequency and intensity of access to LMS

• mouse click count

• time-on-task (e.g., on problem-solving or a particular resource)

• reaction time or time cost (time taken to answer a question or complete an exercise)

• use of hints/help/guide (quantity and quality)

• artifacts (portfolio, essay, projects, lab reports)

• assessment data (test, exam, quiz)

• patterns in errors, mistakes, misconceptions about a particular topic/concept

• number of videos watched

• video-watching frequency and duration (pause, playback, adjusted speed, embedded questions)

• number of completed assignments/submission

• materials’ annotation, highlight, notes, summary

• page turning/jumping

• discussion forums (viewing, posting, voting, updating)

• bookmarks (created, shared, commented)

• library usage

• survey data

• social networks (number of connections and community membership)
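
As noted above, many of these indicators can be computed directly from time-stamped event logs. The following is a minimal sketch in Python (pandas), assuming a hypothetical export named lms_events.csv with columns user_id, timestamp and event_type; real LMS exports, e.g., Moodle’s standard log, differ in naming and granularity.

```python
import pandas as pd

# Hypothetical export: one row per logged user action.
log = pd.read_csv(
    "lms_events.csv",           # assumed file name
    parse_dates=["timestamp"],  # assumed columns: user_id, timestamp, event_type
)

# Aggregate raw events into per-student engagement indicators.
features = log.groupby("user_id").agg(
    total_events=("event_type", "size"),                       # overall activity volume
    active_days=("timestamp", lambda t: t.dt.date.nunique()),  # frequency of LMS access
    forum_posts=("event_type", lambda e: (e == "forum_post").sum()),
    videos_watched=("event_type", lambda e: (e == "video_play").sum()),
    quiz_submissions=("event_type", lambda e: (e == "quiz_submit").sum()),
)
print(features.head())
```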

How can these metrics or indicators be used to support teaching and learning? LA uses such metrics to examine the relationship between online engagement indicators and learning outcomes, where learning outcomes are defined as retention/completion and final grade or gain score. Although there is general consensus about a positive correlation between students’ engagement and their learning outcomes, the relevant literature reports some contradictory findings and sometimes fails to show any association.

Macfadyen and Dawson (2010) used learning analytics data to explore the relationship between the number of messages a student posted, time spent on educational resources, and learning outcomes. Although there was a strong correlation between students’ active posting and their academic success, there was no significant relationship between time spent on educational resources and final grades. These results are in stark contrast with more recent LA studies: Yang et al. (2019) reported a positive correlation between reading time spent on e-book learning materials, as measured with time-stamped data, and students’ learning outcomes. Lu et al. (2017) asserted that, using online engagement data collected by LA, it is possible to predict academic performance after only one-third of the semester. In another attempt, Moreno-Marcos et al. (2020) included engagement indicators such as video-watching behaviors, click counts, number of posts to the discussion forums, and answers to a quiz to predict students’ final grades. Participation in discussion forums showed a weak predictive ability, whereas average answer scores had the highest predictive ability.
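
Such studies typically begin by relating per-student indicators to an outcome measure. Continuing the feature-extraction sketch above, and assuming a hypothetical grades file keyed by user_id, a first correlational pass might look like this (a toy illustration, not a reconstruction of any cited study):

```python
import pandas as pd

# Assumed file: one final grade per student.
grades = pd.read_csv("final_grades.csv")  # columns: user_id, final_grade

# 'features' is the per-student table built in the earlier sketch.
df = features.reset_index().merge(grades, on="user_id")

# Pearson correlation of each indicator with the final grade; weak or
# unstable coefficients here would mirror the mixed findings reported above.
print(df.corr(numeric_only=True)["final_grade"].sort_values(ascending=False))
```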

Studies on MOOCs showed that although active participation in discussion forums (a higher number of posts) was associated with course completion and certification, it was not associated with final achievement (Goldberg et al., 2015). Vu et al. (2015) noted a one-directional relationship: a higher score can predict the number of messages or posts, but participation in forums and the number of posts cannot necessarily predict the gain score. It should be noted that the majority of research on MOOCs considered ‘completion of a course’ as the main outcome and measure of academic achievement.

Engagement Analytics

An LA dashboard is defined as “a single display that aggregates different indicators about learners, learning processes and learning contexts into one or multiple visualizations” (Schwendimann et al., 2017). In online or blended (hybrid) courses, instructors can use an LA visualization dashboard to obtain effective and automatic “engagement tracking” and to inform their pedagogical decisions. Such dashboards can provide insights into the following questions: Do students access the LMS regularly or only before exams/submissions? What proportion of the reading materials is covered? Which resources are used frequently, and which seldom? In what order do students navigate through the content? How many items or questions in a quiz are answered correctly?

There are several LA-based tools for reporting on educational activities, e.g., SNAPP uses comments and posts from discussion forums to visualize the interaction network among learners; GISMO visualizes online learning activities; and LOCO-Analyst summarizes and reports learning activities to instructors. However, most digital learning platforms do not automatically include the advanced tools required for applying LA. Additionally, utilizing these tools can be complex, and some features (e.g., statistics) lie beyond the capability of classroom teachers. Moodle, an open-source and freely available LMS, is used extensively in higher education around the world. Therefore, we briefly describe three free LA tools for Moodle.

a) “MEAP: Moodle Engagement Analytics Plugin” allows instructors to track students’ engagement based on their log activity, assessment data and posts to discussion forums (Liu et al., 2015).

b) “LEMO2 CourseExplorer” is an LA dashboard which visualizes students’ performance in both a standard overview and a customized format. Instructors can apply different filters to specific learning objects and track the daily activity flow, the distribution of performance (quiz, forum, video-watching), test performance, a calendar heatmap, peak activity in terms of time, etc. (Jaakonmäki et al., 2020).

c) “Inspire” is a Moodle Analytics API (application programming interface) that provides a descriptive and predictive analytics engine by implementing machine learning backends (Romero and Ventura, 2020).

Gaps and Challenges Facing LA

Notwithstanding the fact that LA has great potential to support teachers and improve learning, there are a number of issues (and potential measures to ameliorate them) that should be considered when using LA:

1. Using LA to track digital traces and monitor students’ behavior raises serious questions about data security, privacy, user protection and ethics. Enhancing LMS security, setting authorized access, obtaining students’ informed consent, and giving clear instructions about how data will be stored, accessed, or used are among the potential solutions.

2. LA dashboards provide instructors with a summary of students’ engagement; the decisions and actions to be taken are left to instructors or supporting staff. Drawing actionable knowledge from dashboards requires considerable Pedagogical Content Knowledge (PCK). This mandates extensive investment in teacher professional development, both in pre-service and in-service programs. No matter how powerful or expensive an LA dashboard is, it can fulfill its potential only through appropriate human judgement, decision and action.

3. Student-facing dashboards are claimed to foster reflection and Self-regulated Learning (SRL). However, visualization of activity per se may not automatically promote awareness in learners. The majority of LA experts believe students are unable to interpret what analytic dashboards suggest, or fail to take appropriate action (Winne, 2018).

4. Many scholars have warned against too much reliance on teacher intervention based on LA dashboards, due to the risk of reducing students’ autonomy and their responsibility for taking charge of their own learning (Bodily and Verbert, 2017).

5. Quantitative, easy-to-display measures (e.g., presence, frequency or time spent) cannot fully portray purposeful and deep engagement in learning. Dawson and Siemens (2014) called for “LA tools and techniques that go beyond surface-level analytics which uses diverse, alternative assessment methods which reflect twenty-first century education, a kind of complex and multimodal learning.” More qualitative indicators and multi-modal data types should be considered. For example, if a student spends less time on a task, it might be due to superior background knowledge, a better task-management strategy, or a lack of engagement. LA can exploit Machine Learning and AI-based techniques, e.g., Natural Language Processing (NLP) and Deep Neural Networks (DNN), to distinguish between the quantity (frequently posting shallow comments in a forum) and the quality of engagement (deep, meaningful reasoning in a discussion forum); a simple text-statistics illustration of this distinction follows this list. Furthermore, ML techniques can build predictive models which facilitate timely intervention.

6. Finally, although LA tracks and monitors students’ activities in a VLE, a great part of learning may happen offline: the time students spend reading textbooks, reflecting, solving problems, or completing a field project contributes greatly to learning, especially to self-regulated learning, and cannot be captured by LA. Therefore, not being active in online environments cannot be equated with a complete lack of learning. This might explain why some studies failed to find any clear correlational patterns between log data and final scores.
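
As anticipated in point 5, the quantity/quality distinction can be made concrete even without a full NLP pipeline. The sketch below contrasts a count-based metric with two crude text-based proxies; the forum export and its column names are assumptions, and type-token ratio is only a rough stand-in for the depth of reasoning that NLP or DNN models would target.

```python
import pandas as pd

posts = pd.read_csv("forum_posts.csv")  # assumed columns: user_id, post_text

def lexical_diversity(texts: pd.Series) -> float:
    """Type-token ratio over a student's pooled posts (rough quality proxy)."""
    tokens = " ".join(texts).lower().split()
    return len(set(tokens)) / len(tokens) if tokens else 0.0

profile = posts.groupby("user_id").agg(
    n_posts=("post_text", "size"),  # quantity: how often a student posts
    mean_words=("post_text", lambda s: s.str.split().str.len().mean()),
    diversity=("post_text", lexical_diversity),  # proxy only, not a measure of reasoning
)
print(profile.head())
```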

Conclusion

Given the importance that existing literature attaches to students’ engagement and the negative consequences of disengagement (e.g., higher dropout, failure, decreased intrinsic motivation), this study explored how LA can be used to continually and formatively track digital traces of engagement through interaction data, e.g., time-stamped logs. Traditionally, “perceived engagement” was measured through self-report scales, interviews or teacher observation. LA can extract objective engagement features automatically, unobtrusively (without interrupting learning) and ubiquitously. Such data can be used to predict dropout and failure, to give teachers early warnings about students who are struggling or losing interest, and to support timely and appropriate teacher intervention.

LA, as a data-driven approach, has great potential to give an objective picture of learning by uncovering the relations between the processes of learning (e.g., engagement) and the end-product of learning (e.g., achievement). Metrics/indicators of engagement in VLE as well as freely available LA dashboards were delineated. Based on the potential challenges pointed out, we suggest the following for future research:

1. Complementary research approaches which triangulate multifaceted evidence from multiple sources, e.g., students’ self-reported data and LA trace data combined with sensor data. Wearables and sensors such as eye trackers, non-invasive EEG, fMRI, Augmented Reality (AR) and smart glasses can measure neurological states, e.g., alertness, as well as motivational and affective states; for example, while confusion is positively associated with learning, boredom and constant frustration are negatively correlated with learning outcomes.

2. Robust evaluation of LA dashboards as pedagogical tools, including their impact on students’ awareness, self-regulated learning, engagement, outcomes, etc.

3. Optimal design of LA dashboards: there should be a shift from “what elements” are displayed towards “how” combined metrics of engagement can support future engagement. Furthermore, machine learning algorithms, such as Support Vector Machines (SVM) and Artificial Neural Networks (ANN), can be used to develop a more fine-tuned and meaningful classification of engagement levels (a minimal sketch follows this list). This could support instructors in providing timely and tailored pedagogical interventions.
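
As referenced in point 3, the following is a minimal sketch of such a classifier, using scikit-learn’s SVM on synthetic stand-in data. The feature set and the placeholder labelling rule are assumptions for illustration; in practice, labels would come from validated surveys, rubrics, or clustering.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in for log-derived features:
# total events, active days, forum posts, videos watched.
X = rng.poisson(lam=[120, 20, 8, 15], size=(300, 4)).astype(float)

# Placeholder labels (0=low, 1=medium, 2=high), loosely tied to overall
# activity; real labels would come from surveys, rubrics, or clustering.
activity = X.sum(axis=1)
y = np.digitize(activity, bins=np.quantile(activity, [0.33, 0.66]))

X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```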

As Worsley et al. (2021) noted, raw data collected by LA or any other tool can seldom give clear, immediate answers to research questions. The goal of data analysis is to transform data into actionable insights, which requires researchers to have an in-depth understanding of the local context (e.g., setting, participants, available resources and analytic techniques). The LA research community needs to find ways to encourage more involvement of the main stakeholders (e.g., students, teachers, policy-makers) in finding solutions to the technical, methodological, and practical challenges briefly reviewed in this paper. For more discussion, see Bonafini et al. (2017); Drachsler and Greller (2012); Gardner et al. (2020); Hill (2013); Hussain et al. (2018); Lei et al. (2018); Yin et al. (2019).

Author Contributions

The author confirms being the sole contributor of this work and has approved it for publication.

Conflict of Interest

The author declares that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher’s Note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Acknowledgments

We acknowledge support by the Open Access Publication Fund of University Library Passau.

References

Aluja-Banet, T., Sancho, M.-R., and Vukic, I. (2019). Measuring Motivation from the Virtual Learning Environment in Secondary Education. J. Comput. Sci. 36, 1–7. doi:10.1016/j.jocs.2017.03.007

Astin, A. W. (1984). Student Involvement: A Developmental Theory for Higher Education. J. Coll. Student Personnel 25 (4), 297–308.

Bodily, R., and Verbert, K. (2017). Review of Research on Student-Facing Learning Analytics Dashboards and Educational Recommender Systems. IEEE Trans. Learn. Technol. 10 (4), 405–418. doi:10.1109/tlt.2017.2740172

Bonafini, F. C., Chae, C., Park, E., and Jablokow, K. (2017). How Much Does Student Engagement with Videos and Forums in a MOOC Affect Their Achievement. Online Learn. J. 21 (4), 223–240. doi:10.24059/olj.v21i4.1270

Dawson, S., and Siemens, G. (2014). Analytics to Literacies: The Development of a Learning Analytics Framework for Multiliteracies Assessment. Int. Rev. Res. Open Distance Learn. (Athabasca University Press) 15 (4).

DeBoer, J., Ho, A. D., Stump, G. S., and Breslow, L. (2014). Changing "Course". Educ. Res. 43 (2), 74–84. doi:10.3102/0013189x14523038

Deng, R., and Benckendorff, P. (2017). A Contemporary Review of Research Methods Adopted to Understand Students' and Instructors' Use of Massive Open Online Courses (MOOCs). Int. J. Inf. Edu. Tech. 7 (8), 601–607. doi:10.18178/ijiet.2017.7.8.939

Deng, R., Benckendorff, P., and Gannaway, D. (2020). Learner Engagement in MOOCs: Scale Development and Validation. Br. J. Educ. Technol. 51 (1), 245–262. doi:10.1111/bjet.12810

Drachsler, H., and Greller, W. (2012). “The Pulse of Learning Analytics Understandings and Expectations from the Stakeholders,” in Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, April 2012, 120–129. doi:10.1145/2330601.2330634

Eom, S. B., Wen, H. J., and Ashill, N. (2006). The Determinants of Students' Perceived Learning Outcomes and Satisfaction in University Online Education: An Empirical Investigation*. Decis. Sci J Innovative Educ 4 (2), 215–235. doi:10.1111/j.1540-4609.2006.00114.x

Finn, J. D., Pannozzo, G. M., and Voelkl, K. E. (1995). Disruptive and Inattentive-Withdrawn Behavior and Achievement Among Fourth Graders. Elem. Sch. J. 95, 421–434. doi:10.1086/461853

Fredricks, J. A., Blumenfeld, P. C., and Paris, A. H. (2004). School Engagement: Potential of the Concept, State of the Evidence. Rev. Educ. Res. 74 (1), 59–109. doi:10.3102/00346543074001059

Gardner, C., Jones, A., and Jefferis, H. (2020). Analytics for Tracking Student Engagement. J. Interactive Media Edu. 22 (1), 1–7. doi:10.5334/jime.590

Goldberg, L. R., Bell, E., King, C., O’Mara, C., Mclnerney, F., Robinson, A., et al. (2015). Relationship Between Participants’ Level of Education and Engagement in their Completion of the Understanding Dementia Massive Open Online Course. BMC medical Education 15 (1), 60. doi:10.1186/s12909-015-0344-z

Handelsman, M. M., Briggs, W. L., Sullivan, N., and Towler, A. (2005). A Measure of College Student Course Engagement. J. Educ. Res. 98 (3), 184–192. doi:10.3200/joer.98.3.184-192

Hill, P. (2013). Some Validation of MOOC Student Patterns Graphic. Retrieved from http://mfeldstein.com/validationmooc-student-patterns-graphic/.

Hussain, M., Zhu, W., Zhang, W., and Abidi, S. M. R. (2018). Student Engagement Predictions in an E-Learning System and Their Impact on Student Course Assessment Scores. Comput. Intell. Neurosci. 2018, 21. doi:10.1155/2018/6347186

Jaakonmäki, R., vom Brocke, J., Dietze, S., Drachsler, H., Fortenbacher, A., Helbig, R., et al. (2020). Learning Analytics Cookbook: How to Support Learning Processes through Data Analytics and Visualization. Cham: Springer International Publishing.

Joksimović, S., Poquet, O., Kovanović, V., Dowell, N., Mills, C., Gašević, D., et al. (2018). How Do We Model Learning at Scale? A Systematic Review of Research on MOOCs. Rev. Educ. Res. 88 (1), 43–86. doi:10.3102/0034654317740335

Kuh, G. D. (2009). What Student Affairs Professionals Need to Know about Student Engagement. J. Coll. Student Dev. 50 (6), 683–706. doi:10.1353/csd.0.0099

Lee, C.-A., Tzeng, J.-W., Huang, N.-F., and Su, Y.-S. (2021). Prediction of Student Performance in Massive Open Online Courses Using Deep Learning System Based on Learning Behaviors. Educ. Tech. Soc. 24 (3), 130–146. Available at: https://www.jstor.org/stable/10.2307/27032861.

Lee, J.-S. (2014). The Relationship between Student Engagement and Academic Performance: Is it a Myth or Reality? J. Educ. Res. 107 (3), 177–185. doi:10.1080/00220671.2013.807491

Lei, H., Cui, Y., and Zhou, W. (2018). Relationships between Student Engagement and Academic Achievement: A Meta-Analysis. Soc. Behav. Pers 46 (3), 517–528. doi:10.2224/sbp.7054

Liu, D.-Y. T., Froissard, J.-C., Richards, D., and Atif, A. (2015). “An Enhanced Learning Analytics Plugin for Moodle: Student Engagement and Personalised Intervention,” in Globally Connected, Digitally Enabled. Proceedings Ascilite in Perth. Editors T. Reiners, B. R. von Konsky, D. Gibson, V. Chang, L. Irving, and K. Clarke, 168–177.

Lu, O. H. T., Huang, J. C. H., Huang, A. Y. Q., and Yang, S. J. H. (2017). Applying Learning Analytics for Improving Students Engagement and Learning Outcomes in an MOOCs Enabled Collaborative Programming Course. Interactive Learn. Environments 25 (2), 220–234. doi:10.1080/10494820.2016.1278391

Kahan, D. M., Landrum, A., Carpenter, K., Helft, L., and Jamieson, K. H. (2017). Science Curiosity and Political Information Processing. Political Psychology 38 (Suppl 1), 179–199. doi:10.1111/pops.12396

Kovanović, V., Joksimović, S., Poquet, O., Hennis, T., de Vries, P., Hatala, M., et al. (2019). Examining Communities of Inquiry in Massive Open Online Courses: The Role of Study Strategies. The Internet and Higher Education 40, 20–43. doi:10.1016/j.iheduc.2018.09.001

Macfadyen, L. P., and Dawson, S. (2010). Mining LMS Data to Develop an “Early Warning System” for Educators: A Proof of Concept. Comput. Edu. 54, 588–599. doi:10.1016/j.compedu.2009.09.008

Marks, H. M. (2000). Student Engagement in Instructional Activity: Patterns in the Elementary, Middle, and High School Years. Am. Educ. Res. J. 37, 153–184. doi:10.3102/00028312037001153

Moreno-Marcos, P. M., Alario-Hoyos, C., Muñoz-Merino, P. J., and Delgado Kloos, C. (2020). Re-defining, Analyzing and Predicting Persistence: Using Student Events in Online Learning. Appl. Sci. 10 (5), 1–24. doi:10.3390/app10051722

Morris, L. V., Finnegan, C., and Wu, S.-S. (2005). Tracking Student Behavior, Persistence, and Achievement in Online Courses. Internet Higher Edu. 8 (3), 221–231. doi:10.1016/j.iheduc.2005.06.009

Ouimet, J., and Smallwood, R. (2005). CLASSE--the Class-Level Survey of Student Engagement. Assess. Update 17 (6), 13–15.

Reschly, A. L., and Christenson, S. L. (2012). “Jingle, Jangle, and Conceptual Haziness: Evolution and Future Directions of the Engagement Construct,” in Handbook of Research on Student Engagement. Editors S. L. Christenson, A. L. Reschly, and C. Wylie (New York: Springer), 3–19. doi:10.1007/978-1-4614-2018-7_1

Romero, C., and Ventura, S. (2020). Educational Data Mining and Learning Analytics: An Updated Survey. Wires Data Mining Knowledge Discov. 10, e1355. doi:10.1002/widm.1355

Schwendimann, B. A., Rodríguez-Triana, M. J., Vozniuk, A., Prieto, L. P., Boroujeni, M. S., Holzer, A., et al. (2017). Perceiving Learning at a Glance: A Systematic Literature Review of Learning Dashboard Research. IEEE Trans. Learn. Technol. 10 (1), 30–41. doi:10.1109/tlt.2016.2599522

Siemens, G. (2012). “Learning Analytics: Envisioning a Research Discipline and a Domain of Practice,” in Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, Vancouver, April 2012, 4–8.

Sinatra, G. M., Heddy, B. C., and Lombardi, D. (2015). The Challenges of Defining and Measuring Student Engagement in Science. Educ. Psychol. 50 (1), 1–13. doi:10.1080/00461520.2014.1002924

Vu, D., Pattison, P., and Robins, G. (2015). Relational Event Models for Social Learning in MOOCs. Social Networks 43, 121–135. doi:10.1016/j.socnet.2015.05.001

Whittemore, R., and Knafl, K. (2005). The Integrative Review: Updated Methodology. J. Adv. Nurs. 52, 546–553. doi:10.1111/j.1365-2648.2005.03621.x

Winne, P. H. (2018). “Cognition and Metacognition in Self-Regulated Learning,” in Handbook of Self-Regulation of Learning and Performance. Editors D. H. Schunk, and J. A. Greene (New York: Routledge), 36–48.

Worsley, M., Martinez-Maldonado, R., and D'Angelo, C. (2021). A New Era in Multimodal Learning Analytics: Twelve Core Commitments to Ground and Grow MMLA. J. Learn. Analytics 8 (3), 10–27. doi:10.18608/jla.2021.7361

Yang, C. C., Flanagan, B., Akçapınar, G., and Ogata, H. (2019). “Investigating Subpopulation of Students in Digital Textbook reading Logs by Clustering,” in Proceedings of the 9th International Conference on Learning Analytics and Knowledge (LAK’19), Tempe, AZ, Mar 2019 (Tempe, AZ: Society for Learning Analytics Research), 465–470.

Yin, C., Yamada, M., Oi, M., Shimada, A., Okubo, F., Kojima, K., et al. (2019). Exploring the Relationships between reading Behavior Patterns and Learning Outcomes Based on Log Data from E-Books: A Human Factor Approach. Int. J. Human–Computer Interaction 35 (4-5), 313–322. doi:10.1080/10447318.2018.1543077

Keywords: engagement, learning analytics (LA), technology-enhanced assessment, dashboards and visualization, LMS (learning management systems)

Citation: Caspari-Sadeghi S (2022) Applying Learning Analytics in Online Environments: Measuring Learners’ Engagement Unobtrusively. Front. Educ. 7:840947. doi: 10.3389/feduc.2022.840947

Received: 21 December 2021; Accepted: 11 January 2022;
Published: 25 January 2022.

Edited by:

April Lynne Zenisky, University of Massachusetts Amherst, United States

Reviewed by:

Hui Yong Tay, Nanyang Technological University, Singapore
Timothy Mark O'Leary, University of Melbourne, Australia

Copyright © 2022 Caspari-Sadeghi. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Sima Caspari-Sadeghi, sima.caspari-sadeghi@uni-passau.de
