OPINION article

Front. Psychol., 17 May 2022
Sec. Educational Psychology
This article is part of the Research Topic Insights in Educational Psychology 2021

Recent Development in University Student Learning Research in Blended Course Designs: Combining Theory-Driven and Data-Driven Approaches

Feifei Han1,2*

  • 1Department of Pedagogy and Psychology, Faculty of Education, The University of Hradec Králové, Hradec Králové, Czechia
  • 2Griffith Institute for Educational Research, Griffith University, Brisbane, QLD, Australia

University Student Learning in Blended Course Designs

Advances in Internet and computer-based technologies have accelerated the growth of alternative learning spaces, creating entirely new ways of conceptualizing and assessing the learning experience of students (Wong, 2019). Research has repeatedly found positive impacts of various forms of technology-enhanced learning, such as learning embedded in social networking sites, web-conferencing, webinars, e-portfolios, digital games, mobile apps, and virtual worlds (Jarvoll, 2018; Winkelmann et al., 2020). The higher education sector has also shown rapid development in the use of computer and web-based technologies, which has affected the learning experiences of a significant proportion of university students worldwide. This change has resulted in the widespread use of diverse online spaces developed for learning, in the form of e-learning courses and/or a combination of face-to-face and online delivery systems known as blended course designs (Shin et al., 2018). In a synthesis of the research foci and research methods of studies on university student learning in blended contexts, Bliuc et al. (2007) define blended course designs as the "systematic combination of co-present (face-to-face) interactions and technologically-mediated interactions between students, teachers and learning resources" (p. 234).

More recently, the coronavirus (COVID-19) pandemic has required higher education learning and teaching around the world to respond rapidly, in particular by redeploying even more learning and teaching activities to virtual learning spaces to promote physical distancing. As a result, more and more face-to-face courses have been delivered as blended courses (Mali and Lim, 2021). Under such circumstances, it becomes vitally important to understand the student learning experience in blended course designs, its relation to various forms of learning outcomes, and the key factors which may shape that experience. However, there are many challenges in studying the student experience in blended course designs because it is a complex phenomenon (Ma and Lee, 2021). Compared with learning in a single mode (i.e., fully face-to-face or fully online courses), in blended course designs students are increasingly involved in decision-making in the learning process, such as with whom to work in a classroom tutorial and with whom to discuss in an online forum, how many hours to study online, whether to study in a physical library or log onto an online database, and whether to collaborate in a face-to-face laboratory or in virtual reality. These decisions require students to move back and forth between face-to-face and online contexts, and across physical and virtual learning environments (Han and Ellis, 2020a). As a result, the complexity of the student experience in blended learning involves an interplay of a wide range of factors, which include students' cognition (e.g., conceptions, approaches, and perceptions in learning) (Trigwell and Prosser, 2020); their choices of social interactions in learning (e.g., with whom to collaborate and the mode of the collaboration) (Hadwin et al., 2018); and the material elements across both physical and virtual learning spaces in the face-to-face and online components (e.g., students' choices of learning spaces and their interactions with online learning activities) (Laurillard, 2013). As such, evaluating the university student learning experience in blended course designs requires research methods that go beyond approaches which do not routinely investigate the combined contributions of learners and of the material elements involved in learning to academic performance (Wu et al., 2010; López-Pérez et al., 2011).

Theory-Driven Approaches

Traditionally, research into student learning experience and academic performance in higher education has largely adopted theory-driven approaches, which test hypotheses derived from theories in educational psychology, the learning sciences, and pedagogy and curriculum research (Trigwell and Prosser, 2020). Based on the accumulated empirical evidence, which may support or refute the hypotheses, theories and models of learning have been constantly refined, modified, and updated. Frameworks grounded in theory-driven perspectives predominantly assess aspects of the student learning experience using self-report instruments and measurements, such as focus groups, semi-structured interviews, and Likert-scale questionnaires (Han et al., 2022).

A number of state-of-the-art articles, for instance, have reviewed and summarized the instruments used in the major frameworks of student learning research in higher education, including Student Approaches to Learning Research, Self-regulated Learning Research, Information Processing Research, and Student Engagement Research (Lonka et al., 2004; Vermunt and Donche, 2017; Zusho, 2017). Across these frameworks, the primary means of data collection is the self-report questionnaire, such as the Study Process Questionnaire (SPQ; Biggs et al., 2001), the Revised Approaches to Studying Inventory (RASI; Entwistle and McCune, 2004), the Inventory Learning to Teach Process (ILTP; Endedijk et al., 2016), the Motivated Strategies for Learning Questionnaire (MSLQ; Pintrich, 2004), the Inventory of Learning Patterns (ILP; Donche and Van Petegem, 2008), the Course Experience Questionnaire (CEQ; Ramsden, 1991), and the Inventory of Perceived Study Environments Extended (IPSEE; Könings et al., 2012).

However, self-report measures have been criticized as subjective, and their accuracy in describing students' use of learning approaches and strategies in real learning contexts has been questioned (Zhou and Winne, 2012). In addition, self-report measures and data suffer from a limited capacity to represent the complex (e.g., involving multiple indicators) and dynamic (e.g., changing over time) nature of student learning behaviors. To gain better insights into contemporary university students' experiences of learning, it has been suggested that current self-report methods be expanded with other types of measurement (Vermunt and Donche, 2017). In this regard, learning analytics research is a promising avenue. For instance, Richardson (2017, p. 359) suggested: "The rapidly expanding field of learning analytics provides both researchers and practitioners with the opportunity to monitor students' strategic decisions in online environments in minute detail and in real time."

Data-Driven Approaches

The recent development of educational technology has produced prolific studies using learning analytics, which enables the collection of rich and detailed digital traces of students' interactions with a variety of online learning resources and activities. Such digital trace/log data, also known as observational data, have the advantage of describing student learning behaviors and strategies more objectively and at a finer granularity than self-report methods (Siemens, 2013; Baker and Siemens, 2014). Departing from data-driven approaches, learning analytics has emerged as a growing area of research and has gradually gained popularity in the study of student learning in higher education (Sclater et al., 2016). It employs advanced data mining techniques and algorithms to process observational data, often in relation to students' demographic information, and has been increasingly applied across various domains in the higher education sector, such as advising students' career choices (Bettinger and Baker, 2013); detecting at-risk students to improve retention (Krumm et al., 2014); providing personalized feedback (Gibson et al., 2017); identifying patterns of learning tactics and strategies (Chen et al., 2017); facilitating collaborative learning (Kaendler et al., 2015); monitoring students' affect in learning (Ocumpaugh et al., 2014); and predicting academic learning outcomes (Romero et al., 2013). However, data-driven approaches are often fragmented from educational theories and rely purely on empiricism, which limits the insights they can offer for directing pedagogical innovations and reforms, supporting learning design, fostering quality learning experiences, and improving academic performance (Buckingham Shum and Crick, 2012).
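
To make the idea of granular observational indicators concrete, the following minimal sketch shows one way such indicators might be derived from learning management system trace logs. It is an illustration only, not the procedure used in any of the studies cited here; the data frame, column names, and event labels are hypothetical.

```python
# A minimal, illustrative sketch of deriving observational indicators from
# hypothetical LMS trace logs; not the method of any study cited above.
import pandas as pd

# Hypothetical trace log: one row per logged online learning event.
log = pd.DataFrame({
    "student_id": [1, 1, 1, 2, 2],
    "event": ["video_play", "quiz_attempt", "forum_post",
              "video_play", "video_play"],
    "timestamp": pd.to_datetime([
        "2022-03-01 09:00", "2022-03-01 09:20", "2022-03-02 18:05",
        "2022-03-01 10:00", "2022-03-03 11:30",
    ]),
})

# Frequency of each event type per student (a common granular indicator).
frequency = (log.groupby(["student_id", "event"])
                .size()
                .unstack(fill_value=0))

# Number of distinct active days per student (a simple engagement proxy).
active_days = (log.assign(day=log["timestamp"].dt.date)
                  .groupby("student_id")["day"]
                  .nunique())

print(frequency)
print(active_days)
```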

Despite its wide applications, learning analytics research has received criticism that the data-driven approaches it relies on are fragmented from educational theory and focus overly on quantitative indicators (Rodríguez-Triana et al., 2015). Patterns and models of student learning derived from such data-centric perspectives, without proper guidance from educational theories, often result in erroneous interpretations and offer limited insight for generating actionable knowledge to locate learning barriers and to inform teaching practice and curriculum design (Wong and Li, 2020; Han and Ellis, 2021).

A Combined Approach

Recognizing the limitations of an overly data-centric approach in learning analytics research, researchers have proposed adopting a more holistic approach when designing research, so that student learning behaviors and patterns can be captured comprehensively and big data modeling and interpretation can be guided by sound theories (Lockyer et al., 2013; Rienties and Toetenel, 2016). This has resulted in an increasing amount of research using a combined approach, which employs both self-report and observational instruments to measure the student learning experience in a complementary manner (Gašević et al., 2015).

Reviewing the existing research using a combined approach suggests that these studies have two different aims. One aim is to examine how a combined approach may improve the explanatory power of predicting student learning outcomes by including both self-reported and observational measures of aspects of the processes of student learning (Reimann et al., 2014). Despite being based on different learning theories, the majority of studies in this line of inquiry have reported that combining self-reported and observed measures of student learning significantly improves the variance explained in predicting student academic achievement, compared with using either self-report or observational methods alone (Tempelaar et al., 2015; Han and Ellis, 2020b). For example, using multiple regression analyses, Pardo et al. (2017) found that students' reported anxiety in learning and use of self-regulated learning strategies could only explain 7% of the variance in students' academic performance, whereas adding the observed frequency of students' online interactions into the regression model explained 32% of the variance, a significant increase of 25% in the total variance explained. In another study, which adopted the Student Approaches to Learning research framework, Ellis et al. (2017) reported similar findings: while students' self-reported use of approaches to learning only predicted 9% of the variance in their academic learning outcomes, including students' observed online learning events in the multiple regression equation explained an additional 25% of the variance in students' learning outcomes.
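
As a hedged illustration of the hierarchical-regression logic described above (and not a reproduction of the models in Pardo et al., 2017 or Ellis et al., 2017), the sketch below fits a model with self-report predictors only, then adds an observed online learning indicator and reports the change in variance explained. All variable names and the simulated data are hypothetical.

```python
# Sketch of incremental (hierarchical) regression: compare R-squared with
# self-report predictors only vs. self-report plus an observed indicator.
# Data and column names ("anxiety", "srl_strategies", "online_freq",
# "grade") are invented for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "anxiety": rng.normal(size=n),
    "srl_strategies": rng.normal(size=n),
    "online_freq": rng.normal(size=n),
})
# Simulated outcome in which the observed online indicator carries most signal.
df["grade"] = (0.1 * df["srl_strategies"] - 0.1 * df["anxiety"]
               + 0.5 * df["online_freq"] + rng.normal(size=n))

# Step 1: self-report measures only.
m1 = smf.ols("grade ~ anxiety + srl_strategies", data=df).fit()
# Step 2: add the observed online learning indicator.
m2 = smf.ols("grade ~ anxiety + srl_strategies + online_freq", data=df).fit()

print(f"R2 self-report only: {m1.rsquared:.2f}")
print(f"R2 combined:         {m2.rsquared:.2f}")
print(f"Delta R2:            {m2.rsquared - m1.rsquared:.2f}")
```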

Another aim of the studies which combine self-reported and observational measures is to investigate the consistency between the two methods in describing student experiences of learning (Reimann et al., 2014). Research with this aim has likewise departed from different learning perspectives and has diverse foci. On the one hand, self-report questionnaires have been used to assess students' intrinsic motivation, test anxiety, self-efficacy, engagement, effort expenditure, achievement goals, and learning orientations and motives. On the other hand, a diversity of observed indicators of students' online learning, including the frequency of clicks, completion of online learning tasks, duration of online learning events, and time-stamped sequences of online learning behaviors, have been collected through digital traces to derive students' online learning tactics, strategies, and approaches (Gašević et al., 2017; Pardo et al., 2017; Han et al., 2020; Sun and Xie, 2020; Ober et al., 2021). However, studies with this aim have reported inconclusive results: non-alignment between the descriptions and categorizations of student learning experience obtained from self-reports and from observations has been found in a number of studies.

For instance, Gašević et al. (2017) found no significant difference in self-reported use of surface learning strategies between students who were categorized as deep and surface learners according to their observed online learning strategies. In contrast, consistency between students' self-reported learning orientations (as measured by their approaches to learning and perceptions of learning) and their online participation was observed in Han et al. (2020). Students who were categorized as having an "understanding" learning orientation (i.e., learning oriented toward gaining an in-depth understanding of the subject matter) were also observed to participate more in online learning than those categorized as having a "reproducing" learning orientation (i.e., learning oriented toward reproducing facts and satisfying course requirements).
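
One simple way to quantify the degree of (non-)alignment between the two kinds of categorization is to cross-tabulate them and compute a chance-corrected agreement statistic. The sketch below does this with Cohen's kappa; the labels are invented for illustration, and the cited studies did not necessarily use this exact procedure.

```python
# Minimal sketch of checking agreement between self-reported and
# trace-derived categorizations; labels below are hypothetical.
import pandas as pd
from sklearn.metrics import cohen_kappa_score

labels = pd.DataFrame({
    # Orientation inferred from self-report questionnaires.
    "self_report": ["understanding", "reproducing", "understanding",
                    "reproducing", "understanding", "reproducing"],
    # Orientation inferred from observed online learning patterns.
    "observed":    ["understanding", "reproducing", "reproducing",
                    "reproducing", "understanding", "understanding"],
})

# Contingency table of the two categorizations.
print(pd.crosstab(labels["self_report"], labels["observed"]))

# Chance-corrected agreement: ~0 means no alignment, 1 means full agreement.
kappa = cohen_kappa_score(labels["self_report"], labels["observed"])
print(f"Cohen's kappa: {kappa:.2f}")
```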

These inconsistent results are possibly caused by (1) the different learning theories adopted by these studies (e.g., self-regulated learning, Student Approaches to Learning, and student engagement); and (2) the diverse ways in which students' online learning is measured (e.g., frequency, duration, temporal sequence, or a combination of multiple indicators). Before a firmer conclusion can be drawn, many more studies are required to examine the extent to which self-reported and observational measures of student learning experience align with each other.

A combined approach has several strengths. First, it offers richer information for predicting student learning outcomes than a single approach, with each method supplementing the other. While observational measures can provide objective evidence of what students actually do in their learning (Fincham et al., 2018), they lack the capacity to reflect students' intents and motives behind the ways they learn, which self-report measures can capture (Asikainen and Gijbels, 2017; Gerritsen-van Leeuwenkamp et al., 2019). Second, a combined approach can serve as a form of triangulation to check the validity of results derived from either theory-driven or data-driven approaches. Third, the multiple data collection and analysis methods used in a combined approach strengthen the analytical power of the research. These merits of combining theory-driven and data-driven approaches point to its future application in advancing university student learning research and its potential to tackle other complex issues in contemporary student experiences of learning.

Author Contributions

FH contributed to the conception of the work, drafted the work and revised it critically for important intellectual content, approved the final version of the paper to be published, and agreed to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved.

Funding

This study was supported by the International mobilities for research activities of the University of Hradec Králové II, CZ.02.2.69/0.0/0.0/18_053/0017841.

Conflict of Interest

The author declares that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher's Note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

References

Asikainen, H., and Gijbels, D. (2017). Do students develop towards more deep approaches to learning during studies? A Systematic review on the development of students' deep and surface approaches to learning in higher education. Educ. Psychol. Rev. 29, 205–234. doi: 10.1007/s10648-017-9406-6

Baker, R., and Siemens, G. (2014). "Educational data mining and learning analytics," in The Cambridge Handbook of the Learning Sciences, 2nd ed, ed Sawyer, R. (New York, NY: Cambridge University Press), 253–272.

Bettinger, E., and Baker, R. (2013). The effects of student coaching: an evaluation of a randomized experiment in student advising. Educ. Eval. Policy Anal. 36, 3–19. doi: 10.3102/0162373713500523

Biggs, J., Kember, D., and Leung, D. Y. P. (2001). The revised two-factor study process questionnaire: R-SPQ-2F. Br. J. Educ. Psychol. 71, 133–149. doi: 10.1348/000709901158433

Bliuc, A. M., Goodyear, P., and Ellis, R. A. (2007). Research focus and methodological choices in studies into students' experiences of blended learning in higher education. Intern. High. Educ. 10, 231–244. doi: 10.1016/j.iheduc.2007.08.001

Buckingham Shum, S., and Crick, R. (2012). “Learning dispositions and transferable competencies: pedagogy, modelling and learning analytics,” in Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (New York: ACM Press), 92–101.

Chen, B., Resendes, M., Chai, C., and Hong, H. (2017). Two tales of time: Uncovering the significance of sequential patterns among contribution types in knowledge-building discourse. Interact. Learn. Environ. 25, 162–75. doi: 10.1080/10494820.2016.1276081

Donche, V., and Van Petegem, P. (2008). “The validity and reliability of the short inventory of learning patterns,” in Style and cultural differences: How can organisations, regions and countries take advantage of style differences, eds Broeck, Cools, Redmond, and Evans, (Gent: Vlerick Leuven Gent Management School), 49–59.

Ellis, R. A., Han, F., and Pardo, A. (2017). Improving insights from learning analytics – The Value of combining observed and self-report data on university student learning. Educ. Technol. Soc. 20, 158–169.

Endedijk, M. D., Brekelmans, M., Sleegers, P., and Vermunt, J. D. (2016). Measuring students' self-regulated learning in professional education: bridging the gap between event and aptitude measurements. Qual. Qty. 50, 2141–2164. doi: 10.1007/s11135-015-0255-4

Entwistle, N., and McCune, V. (2004). The conceptual bases of study strategy inventories. Educ. Psychol. Rev. 16, 325–345. doi: 10.1007/s10648-004-0003-0

Fincham, E., Gašević, D., Jovanović, J., and Pardo, A. (2018). From study tactics to learning strategies: an analytical method for extracting interpretable representations. IEEE Trans. Learn. Technol. 12, 59–72. doi: 10.1109/TLT.2018.2823317

Gašević, D., Dawson, S., and Siemens, G. (2015). Let's not forget: learning analytics are about learning. TechTrends 59, 64–75. doi: 10.1007/s11528-014-0822-x

Gašević, D., Jovanovic, J., Pardo, A., and Dawson, S. (2017). Detecting learning strategies with analytics: Links with self-reported measures and academic performance. J. Learn. Anal. 4, 113–128. doi: 10.18608/jla.2017.42.10

Gerritsen-van Leeuwenkamp, K., Joosten-ten Brinke, D., and Kester, L. (2019). Students' perceptions of assessment quality related to their learning approaches and learning outcomes. Stud. Educ. Eval. 63, 72–82. doi: 10.1016/j.stueduc.2019.07.005

Gibson, A., Aitken, A., Sándor, Á., Buckingham Shum, S., Tsingos-Lucas, C., and Knight, S. (2017). “Reflective writing analytics for actionable feedback,” in Proceedings of the Seventh International Learning Analytics and Knowledge Conference (New York: ACM Press), 153–162. doi: 10.1145/3027385.3027436

Hadwin, A., Järvelä, S., and Miller, M. (2018). “Self-regulation, co-regulation, and shared regulation in collaborative learning environments,” in Handbook of Self-Regulation of Learning and Performance, eds Schunk, D. H., and Greene J. A., (London: Taylor and Francis Group), 83–106.

Han, F., and Ellis, R. A. (2020a). Personalised learning networks in the university blended learning context. Comunicar 62, 19–30. doi: 10.3916/C62-2020-02

Han, F., and Ellis, R. A. (2020b). Combining self-reported and observational measures to assess university student academic performance in blended course designs. Australas. J. Educ. Technol. 36, 1–14. doi: 10.14742/ajet.6369

Han, F., and Ellis, R. A. (2021). Predicting students' academic performance by their online learning patterns in a blended course. Educ. Technol. Soc. 24, 191–204.

Han, F., Ellis, R. A., and Pardo, A. (2022). The descriptive features and quantitative aspects of students' observed online learning: How are they related to self-reported perceptions and learning outcomes? IEEE Trans. Learn. Technol. doi: 10.1109/TLT.2022.3153001

Han, F., Pardo, A., and Ellis, R. A. (2020). Students' self-report and observed learning orientations in blended university course design: How are they related to each other and to academic performance? J. Comput. Assist. Learn. 36, 969–980. doi: 10.1111/jcal.12453

Jarvoll, A. B. (2018). “I'll have everything in diamonds!”: Students' experiences with Minecraft at school. Stud. Paedagog. 23, 67–90. doi: 10.5817/SP2018-4-4

Kaendler, C., Wiedmann, M., Rummel, N., and Spada, H. (2015). Teacher competencies for the implementation of collaborative learning in the classroom: a framework and research review. Educ. Psychol. Rev. 27, 505–536. doi: 10.1007/s10648-014-9288-9

Könings, K. D., Brand-Gruwel, S., and Elen, J. (2012). Effects of a school reform on longitudinal stability of students' preferences with regard to education. Br. J. Educ. Psychol. 82, 512–532. doi: 10.1111/j.2044-8279.2011.02044.x

Krumm, A., Waddington, R., Teasley, S., and Lonn, S. (2014). “A learning management system-based early warning system for academic advising in undergraduate engineering,” in Learning Analytics: From Research to Practice, eds Larusson, J., and White, B., (New York, NY: Springer), 103–119.

Laurillard, D. (2013). Rethinking University Teaching: A Conversational Framework for the Effective Use of Learning Technologies, London: Routledge.

Lockyer, L., Heathcote, E., and Dawson, S. (2013). Informing pedagogical action: aligning learning analytics with learning design. Am. Behav. Sci. 57, 1439–1459. doi: 10.1177/0002764213479367

Lonka, K., Olkinuora, E., and Mäkinen, J. (2004). Aspects and prospects of measuring studying and learning in higher education. Educ. Psychol. Rev. 16, 301–323. doi: 10.1007/s10648-004-0002-1

López-Pérez, M., López-Pérez, M., and Rodríguez-Ariza, L. (2011). Blended learning in higher education: students' perceptions and their relation to outcomes. Comput. Educ. 56, 818–826. doi: 10.1016/j.compedu.2010.10.023

Ma, L., and Lee, C. S. (2021). Evaluating the effectiveness of blended learning using the ARCS model. J. Comput. Assist. Learn. 37, 1397–1408. doi: 10.1111/jcal.12579

Mali, D., and Lim, H. (2021). How do students perceive face-to-face/blended learning as a result of the Covid-19 pandemic? Int. J. Manag. Educ. 19, 100552. doi: 10.1016/j.ijme.2021.100552

Ober, T. M., Hong, M. R., Rebouças-Ju, D. A., Carter, M. F., Liu, C., and Cheng, Y. (2021). Linking self-report and process data to performance as measured by different assessment types. Comput. Educ. 167, 104188. doi: 10.1016/j.compedu.2021.104188

Ocumpaugh, J., Baker, R., Gowda, S., Heffernan, N., and Heffernan, C. (2014). Population validity for educational data mining models: a case study in affect detection. Br. J. Educ. Technol. 45, 487–501. doi: 10.1111/bjet.12156

Pardo, A., Han, F., and Ellis, R. A. (2017). Combining university student self-regulated learning indicators and engagement with online learning events to predict academic performance. IEEE Trans. Learn. Technol. 10, 82–92. doi: 10.1109/TLT.2016.2639508

Pintrich, P. R. (2004). A conceptual framework for assessing motivation and self-regulated learning in college students. Educ. Psychol. Rev. 16, 385–407. doi: 10.1007/s10648-004-0006-x

Ramsden, P. (1991). A performance indicator of teaching quality in higher education: the course experience questionnaire. Stud. High. Educ. 16, 129–150. doi: 10.1080/03075079112331382944

Reimann, P., Markauskaite, L., and Bannert, M. (2014). e-Research and learning theory: what do sequence and process mining methods contribute? Br. J. Educ. Technol. 45, 528–540. doi: 10.1111/bjet.12146

Richardson, J. T. (2017). Student learning in higher education: a commentary. Educ. Psychol. Rev. 29, 353–362. doi: 10.1007/s10648-017-9410-x

Rienties, B., and Toetenel, L. (2016). The impact of learning design on student behaviour, satisfaction and performance: a cross-institutional comparison across 151 modules. Comput. Hum. Behav. 60, 333–341. doi: 10.1016/j.chb.2016.02.074

Rodríguez-Triana, M., Martínez-Monés, A., Asensio-Pérez, J., and Dimitriadis, Y. (2015). Scripting and monitoring meet each other: Aligning learning analytics and learning design to support teachers in orchestrating CSCL situations. Br. J. Educ. Technol. 46, 330–343. doi: 10.1111/bjet.12198

Romero, C., López, M. I., Luna, J. M., and Ventura, S. (2013). Predicting students' final performance from participation in on-line discussion forums. Comput. Educ. 68, 458–472. doi: 10.1016/j.compedu.2013.06.009

Sclater, N., Peasgood, A., and Mullan, J. (2016). Learning Analytics in Higher Education: A Review of UK and International Practice. London: Jisc.

Shin, Y., Park, J., and Lee, S. G. (2018). Improving the integrated experience of in-class activities and fine-grained data collection for analysis in a blended learning class. Interact. Learn. Environ. 26, 597–612. doi: 10.1080/10494820.2017.1374980

Siemens, G. (2013). Learning analytics: the emergence of a discipline. Am. Behav. Sci. 57, 1380–1400. doi: 10.1177/0002764213498851

Sun, Z., and Xie, K. (2020). How do students prepare in the pre-class setting of a flipped undergraduate math course? A latent profile analysis of learning behavior and the impact of achievement goals. Intern. High. Educ. 46, 100731. doi: 10.1016/j.iheduc.2020.100731

Tempelaar, D. T., Rienties, B., and Giesbers, B. (2015). In search for the most informative data for feedback generation: Learning analytics in a data-rich context. Comput. Hum. Behav. 47, 157–167. doi: 10.1016/j.chb.2014.05.038

Trigwell, K., and Prosser, M. (2020). Exploring University Teaching and Learning: Experience and Context, London: Palgrave Macmillan.

Vermunt, J. D., and Donche, V. (2017). A learning patterns perspective on student learning in higher education: state of the art and moving forward. Educ. Psychol. Rev. 29, 269–299. doi: 10.1007/s10648-017-9414-6

Winkelmann, K., Keeney-Kennicutt, W., Fowler, D., Lazo Macik, M., Perez Guarda, P., and Joan Ahlborn, C. (2020). Learning gains and attitudes of students performing chemistry experiments in an immersive virtual world. Interact. Learn. Environ. 28, 620–634. doi: 10.1080/10494820.2019.1696844

Wong, B. T. M., and Li, K. C. (2020). A review of learning analytics intervention in higher education (2011–2018). J. Comput. Educ. 7, 7–28. doi: 10.1007/s40692-019-00143-7

Wong, R. (2019). Basic psychological needs of students in blended learning. Interact. Learn. Environ. 1–15. doi: 10.1080/10494820.2019.1703010

Wu, J. H., Tennyson, R., and Hsia, T. L. (2010). A study of student satisfaction in a blended e-learning system environment. Comput. Educ. 55, 155–164. doi: 10.1016/j.compedu.2009.12.012

Zhou, M., and Winne, P. H. (2012). Modeling academic achievement by self-reported versus traced goal orientation. Learn. Instr. 22, 413–419. doi: 10.1016/j.learninstruc.2012.03.004

Zusho, A. (2017). Toward an integrated model of student learning in the college classroom. Educ. Psychol. Rev. 29, 301–324. doi: 10.1007/s10648-017-9408-4

Keywords: university student learning research, blended course designs, theory-driven approaches, data-driven approaches, a combined approach

Citation: Han F (2022) Recent Development in University Student Learning Research in Blended Course Designs: Combining Theory-Driven and Data-Driven Approaches. Front. Psychol. 13:905592. doi: 10.3389/fpsyg.2022.905592

Received: 27 March 2022; Accepted: 28 April 2022;
Published: 17 May 2022.

Edited by:

Jesus de la Fuente, University of Navarra, Spain

Reviewed by:

Chia-Lin Tsai, University of Northern Colorado, United States

Copyright © 2022 Han. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Feifei Han, feifei.han@griffith.edu.au
