
ORIGINAL RESEARCH article

Front. Psychol., 16 December 2021
Sec. Educational Psychology
This article is part of the Research Topic "Higher Education Dropout After COVID-19: New Strategies to Optimize Success."

Learning Beliefs, Time on Platform, and Academic Performance During the COVID-19 in University STEM Students

  • 1Laboratorio de Investigación e Innovación educativa Dirección de Docencia, Universidad de Concepción, Concepción, Chile
  • 2Centro de Investigación en Educación y Desarrollo (CIEDE-UCSC), Departamento Fundamentos de la Pedagogía, Facultad de Educación, Universidad Católica de la Santísima Concepción, Concepción, Chile
  • 3Programa de Doctorado Educación en Consorcio, Universidad Católica de la Santísima Concepción, Concepción, Chile
  • 4Departamento de Física, Facultad de Ciencias Físicas y Matemáticas, Universidad de Concepción, Concepción, Chile
  • 5Departamento de Bioquímica y Biología Molecular, Facultad de Ciencias Biológicas, Universidad de Concepción, Concepción, Chile

Due to the closure of universities worldwide because of the COVID-19 pandemic, teaching was suddenly transformed into an emergency remote teaching (ERT) modality. Given the practical nature of STEM courses, students could not participate in activities in which manipulating objects is necessary for accomplishing learning objectives. In this study, we analyze the relationship between STEM students' learning beliefs at the beginning of ERT (T1) and both their Learning Management System (LMS) time-on-task and their final academic performance (T2) during the first semester of ERT. We used a prospective longitudinal design. Participants were 2,063 students (32.3% female) from a university in Chile whose academic year ran from March to December 2020. We assessed their learning and performance beliefs through an online questionnaire answered at the beginning of the academic period (T1). Then, using learning analytics, we assessed the time invested in the CANVAS LMS and the academic performance achieved by students at the end of the semester (T2). The results show that students mainly stated negative beliefs about learning opportunities during ERT (n = 1,396; 67.7%). In addition, 48.5% (n = 1,000) of students stated beliefs of "medium" academic performance for the first semester (T1). Students with lower learning beliefs at T1 spent less time in the LMS during the semester and had lower academic performance at T2 than students with higher learning beliefs at T1. The implications of these findings for the role of instructors and institutions of higher education are discussed.

Introduction

It is expected that the demand for professionals in STEM careers will increase in the coming years (Avendaño Rodríguez and Magaña Medina, 2018; UNESCO, 2019). Furthermore, the World Economic Forum held in 2020 revealed that critical thinking and collaboration are highly demanded competencies and that this need will keep increasing. These aspects, along with other skills such as foundational literacy and character qualities, are 21st-century skills. Developing these abilities in students and professionals implies the need to implement active learning strategies in STEM areas (Soler and Dadlani, 2020). Active learning strategies make students the main agents responsible for their own education through meaningful activities, with the teacher acting as a facilitator or guide during the learning process (Hernández-De-Menéndez et al., 2019).

In this context, research on how to improve STEM students’ teaching and learning is relevant (Hou et al., 2021). However, today, STEM students have undergone a transformation in their learning experiences due to the COVID-19 pandemic (Cannon et al., 2021). The sudden change of teaching modality that all students worldwide suffered due to the pandemic is known as emergency remote teaching (ERT; Bozkurt and Sharma, 2020; Bustamante, 2020). This denomination is because the conditions of online education created during COVID-19 were not planned as expected in other scenarios (Pappas and Giannakos, 2021). Therefore, the improvisation and ingenuity of many instructors, who were not prepared for such a drastic change of teaching modality, prevailed (Hodges et al., 2020; Lobos Peña et al., 2021).

Studies conducted during the ERT period identified that: (a) students feared facing many difficulties while working online and believed that their instructors could not help them enough (Akcil and Bastas, 2020); (b) students showed higher motivation for online learning when they perceived greater usefulness and ease of use in virtual learning tools, so thoroughness in the choice and planning of resources and activities is crucial (Cicha et al., 2021); and (c) students were more satisfied with online education when they perceived less impact of the pandemic on their institutions' preparation for and adaptation to the virtual format (Gonçalves et al., 2020). As valuable as ERT is, accelerating the implementation of teaching processes caused the loss of several elements critical to its effectiveness (Hodges et al., 2020). When these elements are not guaranteed, ERT becomes a modality that merely replicates face-to-face strategies instead of taking advantage of the resources and benefits of online learning systems. The effectiveness of online education lies heavily in the careful design and preparation of learning resources, activities, and assessments following an instructional design appropriate to the course.

Learning Beliefs

Self-efficacy beliefs in the context of online learning refer to students' beliefs about being able to successfully execute the tasks and activities presented in the virtual learning environment (Cai et al., 2017; Al-Rahmi et al., 2018); for example, believing in their ability to use the learning management system of their institution. In addition, working with these technological tools demands the capacity to deploy self-regulated learning, which requires greater autonomy and clear goals, among other capabilities (Carter et al., 2020; Qetesh et al., 2020). In an online learning context, students report low perceptions of learning (Chen et al., 2018) and low levels of academic self-efficacy (Casanova et al., 2018; Gopal et al., 2021) when courses do not follow an instructional design specifically created for online teaching.

Systematic reviews of the literature indicate that students' beliefs about their learning can impact academic performance and dropout rates (Richardson et al., 2012; Honicke and Broadbent, 2016). If self-efficacy is low, student engagement and performance will be low (Van der Houwen et al., 2010; Valle et al., 2015; Borzone, 2017), whereas dropout intention will be higher (Casanova et al., 2018). In the context of the pandemic, two longitudinal investigations found that instructors and students expressed low learning beliefs about virtual education at the beginning of the academic period: students had little confidence in online education's opportunities regarding the quality of teaching processes, learning materials and activities, and collaborative work with peers and instructors (Camfield et al., 2021; Lobos Peña et al., 2021). Conversely, students obtain better academic performance and are more satisfied with the teaching and learning processes when their learning beliefs are higher (Kostagiolas et al., 2019).

Research on students' self-efficacy beliefs and behavior in online education is incipient. Despite this, a few reports already indicate that students who believe they will perform better in an online modality interact more with learning activities and resources in virtual environments (Ifenthaler, 2020; Ifenthaler and Yau, 2020). In this line, self-efficacy beliefs have been studied specifically in relation to academic achievement in technology-mediated learning experiences.

Learning Analytics: Time on Platform

Learning analytics is defined as the process of measuring, collecting, analyzing, and reporting data about learners and their contexts to promote learning, considering elements such as data, data analysis, and the intervention measures generated from them (Romero and Ventura, 2020). Concerning students, the use of analytics allows the integration of information such as their behavior during the teaching and learning process, their past or current academic performance, and sociodemographic information, among others (Zilvinskis and Willis, 2019). These data allow for statistical analyses and predictive models that facilitate the early detection of students at possible risk of failure (Larrabee Sønderlund et al., 2019). Furthermore, with learning analytics, users can predict learners' success during a course from various performance indicators, such as grades from previous courses or learners' current performance; tracking learner activity in the LMS is also commonly used (Liz-Domínguez et al., 2019).
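As an illustration of the kind of predictive model described above, the sketch below fits a logistic regression that flags possible risk of failure from a prior grade average and accumulated LMS hours. The data, features, and risk threshold are entirely hypothetical assumptions for illustration; this is not the model used in this study.

```python
# Hypothetical sketch: flagging at-risk students from prior grades and
# LMS activity. All data below are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200
prior_gpa = rng.uniform(1.0, 7.0, n)   # Chilean 1.0-7.0 grading scale
lms_hours = rng.uniform(0.0, 200.0, n)  # accumulated time-on-task in hours

# Synthetic label: failure risk grows as prior GPA and platform time drop
risk = (prior_gpa / 7 + lms_hours / 200 + rng.normal(0, 0.2, n)) < 0.9

X = np.column_stack([prior_gpa, lms_hours])
model = LogisticRegression().fit(X, risk)

# Estimated probability of failure for a student with GPA 4.2 and 30 h on the LMS
p_fail = model.predict_proba([[4.2, 30.0]])[0, 1]
```

In practice, such a model would be trained on institutional records and LMS logs, and its alerts would feed the early-support actions discussed later in the paper.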

One of the most studied variables in learning analytics research is platform time, or the time-on-task invested by students during online education (Ifenthaler and Yau, 2020). For example, considering when an event starts and ends is especially important when defining how long students are actively working in the LMS. The opportunity to extract this kind of information makes learning analytics data a digital footprint left by students in the context of an online course, as it allows estimation of the involvement and effort they deploy during their courses (Miller and Soh, 2013; Rojas-Castro, 2017). Moreover, research in the ERT period has indicated that students who believe they will do better in an online modality interact more with the learning activities and resources of virtual environments (Ifenthaler, 2020; Ifenthaler and Yau, 2020). Reviewing learning analytics thus becomes relevant for those seeking to develop intentional pedagogical actions to improve educational outcomes. Nevertheless, it is essential to know and analyze students' interactions with resources and activities, connection times, and connection moments; these allow us to better understand final performance and to address low performance or the behaviors that lead to it (Zhang et al., 2020).

After the pandemic, universities will likely employ blended learning, which combines quality training in specially designed virtual teaching environments with face-to-face teaching to enhance students' educational experience by responding to their needs (McGrath et al., 2021). A blended learning modality will have to incorporate all the knowledge about virtual tools developed by instructors and institutions during the pandemic and mix it with the best practices of face-to-face classrooms. Unfortunately, this kind of modality has been scarce in Latin America. Therefore, this paper aims to contribute knowledge supporting virtual tools in STEM undergraduate programs in higher education. In this sense, our objective is to evaluate the relationship between STEM students' learning beliefs at the beginning of the ERT (T1) and both their LMS time-on-task during the first semester of the ERT and their final academic performance (T2).

With this research, our goals are:

1. Describe the learning beliefs of STEM undergraduate students during the COVID-19 pandemic ERT at the beginning of the academic semester (T1).

2. Identify STEM university students' level of interaction with the LMS Canvas by the end of the academic semester (T2) during the ERT.

3. Compare the connection time of STEM university students considering variables, such as gender and academic level to which they belong.

4. Analyze the learning beliefs of undergraduate students in the STEM area (T1), considering the student’s interaction with the LMS and the academic performance achieved at the end of the semester (T2).

Materials and Methods

This research followed a simple prospective design (Ato et al., 2013). Figure 1 describes the measurement timeline.


Figure 1. Description of the measurement moments carried out in the investigation.

Participants

Participants were 2,063 undergraduate STEM students from a university in Chile, where the academic year started in March and finished in December 2020. Gender distribution was 32.3% (664) female and 67.8% (1,399) male. The average age was 21.31 years (SD = 2.64). The distribution across STEM areas was: 1,485 students from Engineering (71.9%), 315 from Physical Sciences and Mathematics (15.3%), 185 from Chemical Sciences (9.0%), and 78 from Biological Sciences (3.8%). Concerning academic level, 641 were 1st-year students (31.1%) and 1,422 were students in higher courses (68.9%).

Measurement Instruments

Learning Beliefs

The institution developed and implemented institution-wide a two-item survey at the beginning of the academic period during the ERT due to the COVID-19 pandemic (T1): (1) "I think my learning opportunities in online learning will be," and (2) "I think my academic performance will be." The first item had two possible response options (1 = worse than in face-to-face learning, 2 = the same as in face-to-face learning), whereas the second had three (1 = low, 2 = medium, 3 = high). Therefore, when we refer to learning beliefs, we mean students' beliefs about their opportunities to learn and to maintain their academic performance during the ERT due to COVID-19.

Time on Learning Management Systems

The institution recorded students' time-on-task during the ERT semester. We defined this variable as the time between two interactions (i.e., events associated with a timestamp) in the LMS CANVAS, with a 10-min threshold: if the user performed no action for 10 min, the session was considered finished. Our choice of threshold was based on the evidence described in the literature (Kovanović et al., 2015) and on our researchers' experience. It is important to note that ways of determining time-on-task in an LMS are still under investigation, because they depend on the context and characteristics of the data (Godwin et al., 2016).
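The 10-min rule described above can be sketched as follows: the gap between two consecutive LMS events counts as active time only if it is under the threshold, while a longer gap closes the session and the idle time is discarded. The event timestamps below are illustrative, not actual log data.

```python
# Minimal sketch of the time-on-task rule: consecutive LMS events less
# than 10 minutes apart belong to one session; a gap of 10 minutes or
# more ends the session and is not counted as active time.
from datetime import datetime, timedelta

THRESHOLD = timedelta(minutes=10)

def time_on_task(timestamps):
    """Sum the within-session gaps between consecutive LMS events."""
    ts = sorted(timestamps)
    total = timedelta(0)
    for prev, nxt in zip(ts, ts[1:]):
        gap = nxt - prev
        if gap < THRESHOLD:  # same session: count the gap as active time
            total += gap
        # gap >= 10 min: session ended; idle time is not counted
    return total

events = [
    datetime(2020, 4, 6, 10, 0),
    datetime(2020, 4, 6, 10, 4),   # +4 min  -> counted
    datetime(2020, 4, 6, 10, 30),  # +26 min -> new session, not counted
    datetime(2020, 4, 6, 10, 33),  # +3 min  -> counted
]
print(time_on_task(events))  # 0:07:00
```

Summing this quantity per student over the semester yields the connection-time variable analyzed in the Results section.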

Academic Performance

Participants' academic performance was the grade point average obtained over the first semester during the ERT. Each faculty provided this information from the institutional records. In Chile, grades range from 1.0 to 7.0 points. Grades from 6.0 to 7.0 correspond to academic performance considered "excellent," grades from 5.0 to 5.9 are labeled "good," and 4.0–4.9 are "satisfactory." Finally, grades from 1.0 to 3.9 are "unsatisfactory," which means the student failed the course (MINEDUC, 2020).
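The grade bands above can be written as a small helper; the function name and boundary handling are our own choices for illustration, not part of the MINEDUC scheme.

```python
# Sketch of the Chilean 1.0-7.0 grade bands described above (MINEDUC, 2020).
def grade_label(grade: float) -> str:
    if not 1.0 <= grade <= 7.0:
        raise ValueError("Chilean grades range from 1.0 to 7.0")
    if grade >= 6.0:
        return "excellent"
    if grade >= 5.0:
        return "good"
    if grade >= 4.0:
        return "satisfactory"
    return "unsatisfactory"  # below 4.0 the course is failed

print(grade_label(5.72))  # good
```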

Procedure

This research was approved by the Ethics Committee of the participating university, confirming the ethical criteria for research with human beings. The informed consent form was presented, describing research goals and characteristics for participation in the study.

The questions on learning beliefs were part of a general questionnaire applied in digital format and sent to the students' institutional e-mails. Responses to this questionnaire were received at the beginning of the academic year, during March 2020 (T1). The LMS CANVAS platform (John, 2021) supplied the time spent in the virtual classroom, and each faculty provided academic performance data from the institutional records. We measured these two variables at the end of the first academic semester, during September 2020 (T2).

Assumptions of normality were checked using the Kolmogorov–Smirnov test with the Lilliefors correction (Thode, 2002). We applied Levene's test (Fox and Weisberg, 2018) to verify constant variance between groups (homoscedasticity). Due to the non-normality of the data, the presence of outliers, and, in some cases, the absence of homoscedasticity, we performed Yuen's test (Yuen, 1974) for comparisons of two groups and a one-way ANOVA for trimmed means (Wilcox and Tian, 2011) for analyses involving more than two groups. The method proposed by Algina et al. (2005) was employed for effect size analysis. For data analysis, we used R version 4.0.3 (2020-10-10) in RStudio.
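The authors worked in R; a rough Python analog of the two-group part of this pipeline is sketched below with synthetic data (Levene's test for homoscedasticity, then Yuen's trimmed-means t-test, which SciPy ≥ 1.7 exposes through the `trim` argument of `ttest_ind`). The group sizes and distributions are invented; the trimmed-means one-way ANOVA has no direct SciPy equivalent and, in R, is provided by packages such as WRS2.

```python
# Hypothetical two-group comparison mirroring the robust workflow above.
# The data are synthetic; only the choice of tests follows the paper.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
pos = rng.gamma(2.0, 50.0, 600)   # LMS hours, positive-beliefs group (synthetic)
neg = rng.gamma(2.0, 44.0, 1400)  # LMS hours, negative-beliefs group (synthetic)

# Homoscedasticity check (Levene's test)
lev_stat, lev_p = stats.levene(pos, neg)

# Yuen's test: Welch-type t-test on 20% trimmed means,
# robust to non-normality and outliers
t_stat, t_p = stats.ttest_ind(pos, neg, trim=0.2, equal_var=False)
```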

Results

The goal of this study was to analyze learning beliefs during online education (T1) and their link with the time invested by students in the LMS and with the academic performance achieved at the end of the semester (T2) in the context of the ERT 2020. The findings are presented below.

STEM Students’ Beliefs About Online Learning During ERT Context

Regarding undergraduate STEM students’ learning beliefs during the ERT for the COVID-19 pandemic (T1), students mainly stated negative beliefs about learning opportunities (n = 1,396; 67.7%). Only 32.3% (n = 667) of participants declared that they believed online education would provide them with good learning. When analyzing the characteristics of students according to their learning beliefs, we found that there were no statistically significant differences according to students’ gender and the type of school they came from (public, private, subsidized).

When assessing students' beliefs about their academic performance in the ERT context, 48.5% (n = 1,000) of participants stated beliefs of "medium" academic performance for the first semester, 30.8% (n = 636) declared beliefs of "high" performance, and 20.7% (n = 427) stated that they would perform "low" in the ERT context. When analyzing the characteristics of the participants in each group, we found that students with higher achievement beliefs [F(2,637.7) = 6.09, p = 0.002] presented higher scores on the mathematics university entrance test (PSU; M = 665.23, SD = 64.34) than students with lower achievement beliefs (M = 638.98, SD = 83.79).

Connection Time in the Virtual Classroom (LMS) by Students During the ERT Context

Students spent an average of 92.87 h (SD = 81.59) in the virtual classroom over the entire semester (T2). Time on the platform was analyzed by gender: males spent an average of 94.17 h (SD = 85.23), while females spent 90.13 h (SD = 73.35). The difference in connection time by gender was not statistically significant [t(820.19) = 0.0006, p = 0.999]; therefore, there is no distinction in LMS connection time between men and women (see Table 1).


Table 1. Differences in students’ platform times as a function of student learning beliefs, gender, and academic year to which they belong.

For the analysis of connection time to the LMS according to academic level (1st-year students vs. students taking 2nd- through 4th-year courses), we identified 641 (31.1%) freshmen and 1,422 (68.9%) students attending 2nd-year courses or higher. The 1st-year students spent an average of 99.56 h (SD = 81.23), whereas upper-level students spent 89.86 h (SD = 81.62). Statistically significant differences were found in connection time to the virtual classroom according to students' academic year [t(806.38) = 3.557, p < 0.001; ES = 0.17] (see Figure 2). In this case, 1st-year students presented a longer connection time in the LMS than upper-level students.


Figure 2. Distribution of students' connection time to the Learning Management System (LMS) during the emergency remote teaching (ERT) semester according to their academic level.

Students’ Learning Beliefs, Time Spent Online in the Virtual Classroom, and GPA

Learning beliefs were categorized into students who stated positive beliefs regarding their learning and students who declared negative beliefs. Students with positive learning beliefs spent an average of 102.83 h (SD = 88.46) in the LMS, whereas students with negative beliefs spent an average of 88.11 h (SD = 77.68). The difference in connection time between the groups was statistically significant [t(690.66) = 3.43, p < 0.001, ES = 0.18]: students with positive beliefs about learning spent more hours connected to the LMS (see Table 2).


Table 2. Differences in academic performance obtained by participating students at the end of the semester (T2) as a function of performance beliefs during the ERT (T1).

We assessed the differences between students' academic performance (T2) and their beliefs about academic performance at the beginning of the semester (T1). Students were organized into three performance-belief groups (low, medium, high). Participants who stated "low" performance beliefs obtained an average grade of 5.61 (SD = 0.51), students who declared "medium" performance beliefs obtained an average of 5.72 (SD = 0.52), and students stating "high" performance beliefs averaged 5.78 (SD = 0.55). We found statistically significant differences between the groups [F(2,633.45) = 9.50, p < 0.001]. Students with "low" performance beliefs (T1) obtained lower grades at the end of the semester (T2) than students with "high" (p < 0.001) and "medium" performance beliefs (p < 0.01; see Figure 3).


Figure 3. Academic performance beliefs (T1) and final grades (T2) of STEM students during the ERT semester.

Discussion

Due to the COVID-19 pandemic, university students around the world had to continue their training remotely. ERT is characterized as unplanned and temporary, since its implementation is associated with an emergency. Therefore, neither educational institutions, students, nor instructors were prepared to carry out educational processes efficiently in this context. This situation affected students' experience of their university education, and it was particularly detrimental to students in STEM majors due to the practical nature of their courses.

We identified the relationship between the following three variables: learning beliefs, time spent on tasks on the LMS, and academic performance achieved at the end of the semester.

STEM Students’ Beliefs About Online Learning During ERT Context

In the present study, more than two-thirds of the participants held negative beliefs about online learning during the ERT context. The published empirical evidence shows mixed results concerning beliefs about online learning in STEM. For example, an investigation of pharmacy students' experience during the COVID-19 pandemic indicated that 49% of the participants showed a positive attitude toward online learning, and only 34% identified barriers to online learning (Shawaqfeh et al., 2020). Another study with students from various areas of basic sciences found that participants perceived positive online learning experiences and considered that the situation was handled adequately (Almusharraf and Khahro, 2020). However, contrary to the results presented above, other research reports that although students state that online education responds positively to their needs, they express concerns regarding pedagogical, logistical, and administrative support from their institutions, which negatively impacts their beliefs; moreover, students find it difficult to connect with their professors and classmates (Katz et al., 2021; Rivera-Vargas et al., 2021). These findings agree with the results of this study.

In our context, from October 2019 to February 2020, a social movement developed in Chile due to citizens’ discontent with the government (Morales Quiroga, 2020). Strikes and marches characterized this movement within the second semester of the 2019 academic year. In this period, educational institutions had to implement the ERT modality due to the social situation. After finishing the second semester of 2019 in ERT modality due to the social movement, students started the first semester of 2020 in ERT modality due to the COVID-19 pandemic (Brunner et al., 2020). In the case of the university students in this research, we believe that the online education experience during the social movement accentuated negative beliefs.

To identify the ERT effects of the COVID-19 pandemic, we differentiated between two groups of STEM students: 1st-year students in 2020 and upper-level students. On the one hand, the latter group had a traditional learning experience that allowed them to participate in face-to-face cultural activities, meet peers, and engage in STEM activities such as laboratory practices. On the other hand, the ERT's effects on the 1st-year students are possibly greater, since they had no face-to-face experiences and all their training was remote. Concerning academic performance beliefs, a significant number of students stated that they could achieve medium to high performance during the first semester of 2020, i.e., the first ERT semester. This result is similar to one reported in other research, where students indicated beliefs of achieving positive or close-to-expected results in the ERT scenario (Rager, 2020). This finding could be associated with personal factors such as students' commitment to their undergraduate programs, their self-efficacy beliefs about completing academic assignments in a digital modality, their use of social support sources (peers and family), and the technological resources available to them for academic activities (internet access or computers).

In the ERT scenario, the students' socio-academic integration process was significantly transformed. Their institutional experiences unfolded virtually, limiting the development of their academic identity. According to Tinto's theory, students' success resides in their ability to integrate socially and academically into the university (Tinto, 1975). This model proposes that students see themselves as part of the educational institution when they can frequently interact with peers, instructors, and the university community. This process increases their commitment to the career, benefiting their academic performance and persistence (Tinto, 2017). In this regard, universities should consider implementing programs and policies that benefit the social integration of students during the period of return to higher education institutions as the COVID-19 pandemic comes under control.

Connection Time in the Virtual Classroom (LMS) by Students During the ERT Context

Regarding students' time-on-task in the LMS, we analyzed students' interaction with the platform in their courses in ERT modality. This variable was selected to better understand students' learning process during this period (Klašnja-Milićević et al., 2017). Our results show that, on average, 1st-year students spent more time logged in to the LMS than students in higher courses.

Considering that 1st-year students present a higher dropout risk (Bernardo et al., 2015, 2016), we believe these results can be considered positive, especially for STEM students (Jungert et al., 2019). Still, STEM students' low perception of learning opportunities and low platform time may exacerbate the dropout figures that already existed before the pandemic (Van den Hurk et al., 2019). For this reason, there is a need to strengthen student engagement in the LMS, coupled with student satisfaction with the study content and experience (Fleischer et al., 2019). Furthermore, the difficulties students faced during the ERT could explain the low beliefs regarding learning opportunities in this period. Accordingly, university authorities must reinforce and maintain mechanisms of consultation and accompaniment for the implementation of online learning.

Students’ Initial Learning Beliefs, Time Spent Online in the Virtual Classroom, and GPA

Likewise, we found that low beliefs about learning opportunities and academic performance are related to lower time-on-task and lower academic performance achieved at the end of the semester. This result can be explained from the theoretical approach of student self-efficacy. When students have low beliefs or perceptions of learning ability, they present difficulties regarding their academic performance (Bandura, 2012). In the case of the online setting, when students possess positive beliefs toward learning experiences, they interact to a greater extent with the learning activities and resources in the LMS, leading to higher performance (Ifenthaler, 2020).

Participating students had lower expectations about their learning opportunities, although they believed that the ERT would not affect their academic performance. In line with these findings, research indicates that individual and educational factors affect students' beliefs about virtual learning (Alameri et al., 2020). In this case, uncertainty about how the ERT would unfold, as a new learning experience for instructors and students alike, may have impacted students' beliefs about their learning opportunities, whereas we did not identify significant changes in students' performance expectations under the ERT. Redondo-Gutiérrez et al. (2017) found that when performance expectations relate to individual assessments, they tend to be higher than for group assessments. Regarding assessment during the ERT, students were not clear about how their learning would be evaluated. One explanation for our result could be that students assumed assessment processes would be individual due to the ERT and, consequently, that their grade would reflect their own performance.

Although institutions’ instructional designers and managers understand the difference between ERT and online education, students and instructors usually do not (Chaka, 2020). Only conducting video lectures or leaving the material in repositories is far from representing a successful online educational model. In online education courses, planning and design must follow an instructional model, such as ADDIE or Backward Design (Dean, 2019; Ofosu-Asare et al., 2019). Each learning activity and resource is carefully planned to be implemented through an LMS. Nonetheless, we know that it was necessary to improvise in the ERT period to provide continuity to the educational processes, which resulted in inadequate preparation for both students and instructors. In addition, the uncertainty generated by the crisis could also be reflected in students’ negative beliefs about online learning.

Particularly in STEM courses, the perception of low learning may be influenced by the disciplines' characteristics. Students usually learn scientific practices through hands-on educational experiences in face-to-face laboratories, especially in chemistry, physics, and biology. Although scientific practices can be taught through academic activities, some require students to touch and manipulate elements and to interact instantly with their peers. For example, in physics courses students learn to operate oscilloscopes; in chemistry courses, to handle reagents; and in biology courses, to use microscopes. On the other hand, there are exceptions in which implementing virtual rather than face-to-face laboratories should not cause negative effects. An example is computer science courses, in which laboratories were already performed on computers using programming languages such as Python, R, or Fortran; such activities existed before the ERT caused by the COVID-19 pandemic. The success of these instructional strategies, whether face-to-face or online, lies in linking them with teamwork and with adequate teacher feedback on students' learning.

Given the impossibility of implementing face-to-face laboratory activities for all students, university instructors in STEM faced additional challenges compared to other disciplines. To support their students in achieving course learning outcomes, instructors had to find ways to implement laboratory practices remotely while incorporating active learning techniques (Vogel-Heuser et al., 2020). One option is implementing simulation-based virtual laboratories in which students modify parameters to observe the effects on experiments (Belford and Moore, 2016). An excellent resource is the University of Colorado Boulder’s repository of interactive PhET simulations, whose activity proposals include the HTML iframe code needed to embed a simulation directly into the LMS (University Colorado Boulder, 2021). There are also several alternatives for remote laboratories, such as the WebLab-Deusto educational project of the University of Deusto (Orduña et al., 2011; García-Zubía et al., 2018; Orduña et al., 2018).
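To illustrate the kind of iframe embedding described above, the sketch below assembles the HTML markup for a University of Colorado Boulder simulation page. The simulation name, locale, and dimensions are illustrative placeholders; the canonical embed code should be copied from the “Embed” option on each simulation’s own page rather than constructed by hand.

```python
# Build an HTML iframe snippet for embedding an interactive simulation in an
# LMS page. The simulation slug and size below are illustrative placeholders;
# the "Embed" dialog on each simulation page provides the canonical markup.

def phet_iframe(slug: str, locale: str = "en",
                width: int = 800, height: int = 600) -> str:
    """Return iframe markup pointing at a PhET HTML5 simulation (assumed URL scheme)."""
    src = f"https://phet.colorado.edu/sims/html/{slug}/latest/{slug}_{locale}.html"
    return (
        f'<iframe src="{src}" width="{width}" height="{height}" '
        f'allowfullscreen></iframe>'
    )

# Hypothetical example: embed the "projectile-motion" simulation.
snippet = phet_iframe("projectile-motion")
print(snippet)
```

Pasting such a snippet into an LMS page that accepts raw HTML (as Canvas does) renders the simulation inline, so students interact with it without leaving the course site.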

Another remedial action could be implementing a peer support system in which trained upper-level students with high learning beliefs mentor 1st-year students who hold low beliefs about online learning (Honkimaki and Tynjala, 2018). Moreover, institutions could implement a remote help desk where students can send queries related to ERT. For the latter initiative, the institution needs help desk software and staff trained in both pedagogical and technological aspects to provide timely and efficient responses to students.

A strength of this research is the use of measurements beyond self-report, reducing the bias that relying on self-report alone can generate (De las Cuevas Catresana and González de Rivera, 1992). In this study, online questionnaires allowed us to assess young people’s learning and performance beliefs. In addition, through learning analytics we evaluated students’ behavior within the LMS during the entire semester and linked it with the academic performance achieved at the end of the period. Using learning analytics together with students’ beliefs, especially for 1st-year students, enables early identification of students at risk of dropout and allows personalized support before a student withdraws from the university (Honkimaki and Tynjala, 2018; Wong et al., 2018).
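The article does not detail how its time-on-task variable was computed, but a common approach in the learning analytics literature (Kovanović et al., 2015) is to sum the gaps between consecutive LMS events, truncating long idle gaps at a fixed threshold. A minimal sketch under those assumptions, with an arbitrary 30-minute threshold:

```python
# Estimate time-on-task from a list of LMS event timestamps (in seconds).
# Gaps longer than `idle_cap` are treated as inactivity and truncated.
# The 30-minute threshold is an illustrative assumption, not the value
# (if any) used in the study.

def time_on_task(timestamps: list[float], idle_cap: float = 1800.0) -> float:
    """Return estimated active seconds from gaps between consecutive events."""
    if len(timestamps) < 2:
        return 0.0
    events = sorted(timestamps)
    return sum(
        min(later - earlier, idle_cap)
        for earlier, later in zip(events, events[1:])
    )

# Example: three events 10 min apart, then a 2-h gap capped at 30 min.
events = [0, 600, 1200, 1200 + 7200]
print(time_on_task(events) / 60)  # -> 50.0 minutes
```

The choice of idle threshold materially affects the estimate, which is one reason single-indicator time-on-task measures are a recognized limitation of log-based analytics.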

This study has some limitations: (a) biases in the assessment of students’ actual academic performance, since, due to the COVID-19 pandemic, the participating university implemented a series of educational policies that could affect the academic results obtained by students; (b) the participants belong to a single university which, although one of the largest institutions of higher education in Chile and covering a wide variety of disciplines, has contextual characteristics of its own that could affect the results; (c) the use of a single indicator (time-on-task) to construct the learning analytics variable is insufficient to cover the variety of behaviors that characterize students’ interaction with the virtual classroom; (d) measuring the learning beliefs variable with two items could limit content validity; and (e) it was not possible to identify other variables that could affect the time students spent in the LMS, such as course design or number of credit hours.

Future research could consider other indicators, such as participation in forums, the number of activities performed in the LMS, or the number of resources read and downloaded, among other analytics offered by the LMS (Ifenthaler and Yau, 2020). It would also be interesting to study further the effects of course variables (e.g., level, type, design, credit hours) that could impact student academic performance. Research along these lines could expand and diversify the sample and replicate similar studies once the pandemic context is overcome.

This study contributes to the early identification of at-risk students, encouraging pedagogical actions that decrease students’ negative beliefs about online learning. In addition, our results could help reduce the dropout rates found in STEM careers by guiding institutional actions that address beliefs about online education. These actions are significant because, post-pandemic, a large share of higher education institutions is expected to promote blended learning within their educational models, building on the technological advances, the globalization of information, and the lessons about online education generated during the ERT.

Conclusion

Considering the findings, we conclude that most students held negative beliefs about their opportunities to learn through ERT, and these beliefs were equally present among men and women. We identified that students in their first academic year spent more time connected to the LMS. Additionally, we observed that students with positive beliefs about their learning spent more hours connected to the LMS.

We found that students with higher achievement beliefs obtained higher scores on the mathematics college entrance test (PSU); thus, we believe the PSU score influenced students’ beliefs about their future performance. Similarly, students with low performance beliefs at the beginning of the ERT obtained lower scores at the end of the ERT semester. Students’ beliefs about learning opportunities and performance influenced their time of interaction with the LMS, affecting academic achievement. Thus, it is relevant for teachers and institutions to promote beliefs that relate to positive behaviors in their students.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.

Ethics Statement

The studies involving human participants were reviewed and approved by Universidad de Concepción Ethics Committee. The patients/participants provided their written informed consent to participate in this study.

Author Contributions

KL and FS-D contributed to the design of the study, literature, and writing of the manuscript. RC-R contributed to the design of the study, data analysis, and review of the abstract and manuscript. JM contributed to the data extraction, data analysis, and interpretation of the results. AM contributed to the study’s design, interpretation of the results, and the abstract and manuscript review. CB and NC contributed to the interpretation of the results and writing of the manuscript. All authors contributed to the article and approved the submitted version.

Funding

Research reported in this publication was supported by Unidad de Fortalecimiento Institucional of the Ministerio de Educación Chile, project InES 2018 UCO1808 Laboratorio de Innovación educativa basada en investigación para el fortalecimiento de los aprendizajes de ciencias básicas en la Universidad de Concepción.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher’s Note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Acknowledgments

We thank the Facultad de Ciencias Biológicas, Facultad de Ciencias Físicas y Matemáticas, and Facultad de Ciencias Químicas of Universidad de Concepción for their management and participation.

References

Akcil, U., and Bastas, M. (2020). Examination of university students’ attitudes towards e-learning during the COVID-19 pandemic process and the relationship of digital citizenship. Contemporary. Educ. Technol. 13:ep291. doi: 10.30935/cedtech/9341

Alameri, J., Masadeh, R., Hamadallah, E., Ismail, H. B., and Fakhouri, H. N. (2020). Students’ perceptions of E-learning platforms (Moodle, Microsoft teams and zoom platforms) in The University of Jordan Education and its relation to self-study and academic achievement During COVID-19 pandemic. Advanced Research and Studies Journal 11, 2692–2800.

Algina, J., Keselman, H., and Penfield, R. D. (2005). An alternative to Cohen’s standardized mean difference effect size: a robust parameter and confidence interval in the two independent groups case. Psychol. Methods 10:317. doi: 10.1037/1082-989X.10.3.317

Almusharraf, N., and Khahro, S. (2020). Students’ satisfaction with online learning experiences during the COVID-19 pandemic. Int. J. Emerg. Technol. Learn. 15, 246–267. doi: 10.3991/ijet.v15i21.15647

Al-Rahmi, W. M., Alias, N., Othman, M. S., Alzahrani, A. I., Alfarraj, O., Saged, A. A., et al. (2018). Use of e-learning by university students in Malaysian higher educational institutions: a case in Universiti Teknologi Malaysia. IEEE Access 6, 14268–14276. doi: 10.1109/ACCESS.2018.2802325

Ato, M., López-García, J. J., and Benavente, A. (2013). Un sistema de clasificación de los diseños de investigación en psicología. Ann. Psychol. 29, 1038–1059. doi: 10.6018/analesps.29.3.178511

Avendaño Rodríguez, K. C., and Magaña Medina, D. E. (2018). Elección de carreras universitarias en áreas de ciencia, tecnología, ingeniería y matemáticas (STEM): revisión de la literatura. Revista Interamericana de Educación de Adultos 40, 154–173.

Bandura, A. (2012). On the functional properties of perceived self-efficacy revisited. J. Manag. 38, 9–44. doi: 10.1177/0149206311410606

Belford, R., and Moore, E. B. (2016). ConfChem conference on interactive visualizations for chemistry teaching and learning: an introduction. J. Chem. Educ. 93, 1140–1141. doi: 10.1021/acs.jchemed.5b00795

Bernardo, A. B., Cerezo, R., Núñez, J. C., Tuero, E., and Esteban, M. (2015). Predicción del abandono universitario: variables explicativas y medidas de prevención. Revista Fuentes 16, 63–84. doi: 10.12795/5.i16.063revistafuentes.2013

Bernardo, A., Esteban, M., Fernández, E., Cervero, A., Tuero, E., and Solano, P. (2016). Comparison of personal, social and academic variables related to university drop-out and persistence. Front. Psychol. 7:1610. doi: 10.3389/fpsyg.2016.01610

Borzone, M. (2017). Autoeficacia y vivencias académicas en estudiantes universitarios. Acta Colomb. Psicol. 20, 266–274. doi: 10.14718/ACP.2017.20.1.13

Bozkurt, A., and Sharma, R. C. (2020). Emergency remote teaching in a time of global crisis due to CoronaVirus pandemic. Asian J. Distance Educ. 15, i–vi. doi: 10.5281/zenodo.3778083

Brunner, J.-J., Ganga-Contreras, F., and Labraña-Vargas, J. (2020). Universidad y protesta social: una reflexión desde Chile. Rev. Iberoam. Edu. Super. 11, 3–22. doi: 10.22201/iisue.20072872e.2020.32.814

Bustamante, R. (2020). Educación en cuarentena: cuando la emergencia se vuelve permanente (segunda parte). Available at: http://www.grade.org.pe/creer/archivos/articulo-4.pdf

Cai, Z., Fan, X., and Du, J. (2017). Gender and attitudes toward technology use: a meta-analysis. Comput. Educ. 105, 1–13. doi: 10.1016/j.compedu.2016.11.003

Camfield, E. K., Schiller, N. R., and Land, K. M. (2021). Nipped in the bud: COVID-19 reveals the malleability of STEM student self-efficacy. Cbe-life sciences. Education 20:ar25. doi: 10.1187/cbe.20-09-0206

Cannon, M. B., Cohen, A. S., and Jimenez, K. N. (2021). Connecting native students to STEM research using virtual archaeology. Adv. Archaeol. Pract. 9, 175–185. doi: 10.1017/aap.2021.2

Carter, R. A., Rice, M., Yang, S., and Jackson, H. A. (2020). Self-regulated learning in online learning environments: strategies for remote learning. Inf. Learn. Sci. 121, 321–329. doi: 10.1108/ILS-04-2020-0114

Casanova, J. R., Fernandez-Castañon, A. C., Pérez, J. C. N., Gutiérrez, A. B. B., and Almeida, L. S. (2018). Abandono no Ensino Superior: Impacto da autoeficácia na intenção de abandono. Rev. Bras. Orientac. Prof. 19, 41–49. doi: 10.26707/1984-7270/2019v19n1p41

Chaka, C. (2020). Higher education institutions and the use of online instruction and online tools and resources during the COVID-19 outbreak - An online review of selected U.S. and SA’s universities. Research Square Platform LLC [Preprint]. doi: 10.21203/rs.3.rs-61482/v1

Chen, B., Bastedo, K., and Howard, W. (2018). Exploring best practices for online STEM courses: active learning, Interaction & Assessment Design. Online Learn. 22, doi: 10.24059/olj.v22i2.1369

Cicha, K., Rizun, M., Rutecka, P., and Strzelecki, A. (2021). COVID-19 and higher education: first-year students’ expectations toward distance learning. Sustainability 13:1889. doi: 10.3390/su13041889

De las Cuevas Catresana, J. L., and González de Rivera, J. (1992). Autoinformes y respuestas sesgadas. Anales de Psiquiatría 8, 362–366.

Dean, C. (2019). Backward Design Plus: Taking the Learning Context into Consideration. in E-Learn: World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education 2019, New Orleans, Louisiana, United States.

Fleischer, J., Leutner, D., Brand, M., Fischer, H., Lang, M., Schmiemann, P., et al. (2019). Vorhersage des Studienabbruchs in naturwissenschaftlich-technischen Studiengängen. Z Erziehungswiss 22, 1077–1097. doi: 10.1007/s11618-019-00909-w

Fox, J., and Weisberg, S. (2018). An R Companion to Applied Regression. Sage publications.

García-Zubía, J., Cruz, E., Gil, J. L. R., Jayo, U. H., Martínez, I. A., Fernández, P. O., et al. (2018). WebLab-Boole-Deusto: Plataforma web para el diseño y test en laboratorio remoto de sistemas digitales combinacionales básicos. in Tecnología, Aprendizaje y Enseñanza de la Electrónica: Actas del XIII Congreso de Tecnología, Aprendizaje y Enseñanza de la Electrónica, Tenerife, 20–22 de junio, 2018.

Godwin, K. E., Seltman, H. J., Almeda, M. V., Kai, S., Baker, R. S., and Fisher, A. V. (2016). The variable relationship between on-task behavior and learning. Cogn. Sci.

Gonçalves, S. P., Sousa, M. J., and Pereira, F. S. (2020). Distance learning perceptions from higher education students: the case of Portugal. Educ. Sci. 10:374. doi: 10.3390/educsci10120374

Gopal, R., Singh, V., and Aggarwal, A. (2021). Impact of online classes on the satisfaction and performance of students during the pandemic period of COVID 19. Educ. Inf. Technol. doi: 10.1007/s10639-021-10523-1

Hernández-De-Menéndez, M., Vallejo Guevara, A., Tudón Martínez, J. C., Hernández Alcántara, D., and Morales-Menendez, R. (2019). Active learning in engineering education. A review of fundamentals, best practices, and experiences. Int. J. Interact. Des. Manuf. 13, 909–922. doi: 10.1007/s12008-019-00557-8

Hodges, C., Moore, S., Lockee, B., Trust, T., and Bond, A. (2020). The difference between emergency remote teaching and online learning. Educ. Rev. 27, 1–12.

Honicke, T., and Jaclyn, B. (2016). The influence of academic self-efficacy on academic performance: a systematic review. Educ. Res. Rev. 17, (Suppl. C), 63–84. doi: 10.1016/j.edurev.2015.11.002

Honkimaki, S., and Tynjala, P. (2018). Prerequisites for the successful group mentoring of first-year university students: a case study. Mentor Tutoring 26, 148–164. doi: 10.1080/13611267.2018.1471338

Hou, Y., Muheidat, F., Usher, T., Prado, W., Guo, X., and Wart, M. V. (2021). “Evaluation of the COVID-19 shock on STEM laboratory courses.” in 2021 IEEE Global Engineering Education Conference (EDUCON); April 21–23, 2021.

Ifenthaler, D. (2020). Supporting higher education students through analytics systems. J. Appl. Res. High. Educ. 12, 1–3. doi: 10.1108/JARHE-07-2019-0173

Ifenthaler, D., and Yau, J. Y.-K. (2020). Utilising learning analytics to support study success in higher education: a systematic review. Educ. Technol. Res. Dev. 68, 1961–1990. doi: 10.1007/s11423-020-09788-z

John, R. (2021). Canvas LMS Course Design. Packt Publishing Ltd.

Jungert, T., Hubbard, K., Dedic, H., and Rosenfield, S. (2019). Systemizing and the gender gap: examining academic achievement and perseverance in STEM. Eur. J. Psychol. Educ. 34, 479–500. doi: 10.1007/s10212-018-0390-0

Katz, V. S., Jordan, A. B., and Ognyanova, K. (2021). Digital inequality, faculty communication, and remote learning experiences during the COVID-19 pandemic: a survey of U.S. undergraduates. PLoS One 16:e0246641. doi: 10.1371/journal.pone.0246641

Klašnja-Milićević, A., Ivanović, M., and Budimac, Z. (2017). Data science in education: big data and learning analytics. Comput. Appl. Eng. Educ. 25, 1066–1078. doi: 10.1002/cae.21844

Kostagiolas, P., Lavranos, C., and Korfiatis, N. (2019). Learning analytics: survey data for measuring the impact of study satisfaction on students’ academic self-efficacy and performance. Data Brief 25:104051. doi: 10.1016/j.dib.2019.104051

Kovanović, V., Gašević, D., Dawson, S., Joksimović, S., Baker, R. S., and Hatala, M. (2015). “Penetrating the black box of time-on-task estimation.” in Proceedings of the Fifth International Conference on Learning Analytics and Knowledge, 184–193.

Larrabee Sønderlund, A., Hughes, E., and Smith, J. (2019). The efficacy of learning analytics interventions in higher education: a systematic review. Br. J. Educ. Technol. 50, 2594–2618. doi: 10.1111/bjet.12720

Liz-Domínguez, M., Caeiro-Rodríguez, M., Llamas-Nistal, M., and Mikic-Fonte, F. A. (2019). Systematic literature review of predictive analysis tools in higher education. Appl. Sci. 9:5569. doi: 10.3390/app9245569

Lobos Peña, K., Bustos-Navarrete, C., Cobo-Rendón, R., Fernández Branada, C., Bruna Jofré, C., and Maldonado Trapp, A. (2021). Professors’ expectations about online education and its relationship with characteristics of university entrance and students’ academic performance during the COVID-19 pandemic. Front. Psychol. 12:642391. doi: 10.3389/fpsyg.2021.642391

McGrath, C., Palmgren, P. J., and Liljedahl, M. (2021). Beyond brick and mortar: staying connected in post-pandemic blended learning environments. Med. Educ. 55, 890–891. doi: 10.1111/medu.14546

Miller, L. D., and Soh, L. (2013). “Significant predictors of learning from student interactions with online learning objects.” in 2013 IEEE Frontiers in Education Conference (FIE); October 23–26, 2013.

MINEDUC (2020). Criterios de evaluación, calificación y promoción de estudiantes de 1° básico a 4° año medio. Available at: https://www.mineduc.cl/wp-content/uploads/sites/19/2020/08/CriteriosPromocionEscolarCalificacionEvaluacion.pdf (accessed on August 30, 2021).

Morales Quiroga, M. (2020). Estallido social en Chile 2019: participación, representación, confianza institucional y escándalos públicos. Análisis Político 33, 3–25. doi: 10.15446/anpol.v33n98.89407

Ofosu-Asare, Y., Essel, H. B., and Bonsu, F. M. (2019). E-learning graphical user interface development using the ADDIE instruction design model and developmental research: the need to establish validity and reliability. Journal of Global Research in Education and Social Science 13, 78–83.

Orduña, P., Garcia-Zubia, J., Rodriguez-Gil, L., Angulo, I., Hernandez-Jayo, U., Dziabenko, O., et al. (2018). “The weblab-deusto remote laboratory management system architecture: achieving scalability, interoperability, and federation of remote experimentation,” in Cyber-Physical Laboratories in Engineering and Science Education (Springer), 17–42.

Orduña, P., Irurzun, J., Rodriguez-Gil, L., Garcia-Zubia, J., Gazzola, F., and López-de-Ipiña, D. (2011). Adding new features to new and existing remote experiments through their integration in weblab-deusto. Int. J. Online Eng. 7, (S2), 33–39. doi: 10.3991/ijoe.v7iS2.1774

Pappas, I. O., and Giannakos, M. N. (2021). Rethinking learning design in IT education during a pandemic. Front. Educ. 6:652856. doi: 10.3389/feduc.2021.652856

Qetesh, M. I., Saadh, M. J., Kharshid, A. M., and Acar, T. (2020). Impact of the COVID-19 pandemic on academic achievement and self-regulated learning behavior for students of the faculty of pharmacy, Middle East university. Multicult. Educ. 6, doi: 10.5281/zenodo.4291130

Rager, L. E. (2020). The Impact of COVID-19 on Recruitment, Enrollment, and Freshman Expectations in Higher Education Youngstown State University. Available at: http://rave.ohiolink.edu/etdc/view?acc_num=ysu1597095836717798

Redondo-Gutiérrez, L., Corrás, T., Novo, M., and Fariña, F. (2017). Academic achievement: the influence of expectations, optimism and self-efficacy. Revista de Estudios e investigación en Psicología y Educación 104–108. doi: 10.17979/reipe.2017.0.10.2972

Richardson, M., Abraham, C., and Bond, R. (2012). Psychological correlates of university students’ academic performance: a systematic review and meta-analysis. Psychol. Bull. 138:353. doi: 10.1037/a0026838

Rivera-Vargas, P., Anderson, T., and Cano, C. A. (2021). Exploring students’ learning experience in online education: analysis and improvement proposals based on the case of a Spanish open learning university. Educ. Technol. Res. Dev. doi: 10.1007/s11423-021-10045-0 [Epub ahead of print]

Rojas-Castro, P. (2017). Learning analytics: una revisión de la literatura. Educ. Educ. 20, 406–127.

Romero, C., and Ventura, S. (2020). Educational data mining and learning analytics: an updated survey. WIREs Data Min. Knowl. Discov. 10:e1355. doi: 10.1002/widm.1355

Shawaqfeh, M. S., Al Bekairy, A. M., Al-Azayzih, A., Alkatheri, A. A., Qandil, A. M., Obaidat, A. A., et al. (2020). Pharmacy students perceptions of their distance online learning experience during the COVID-19 pandemic: a cross-sectional survey study. J. Med. Educ. Curric. Dev. 7:238212052096303. doi: 10.1177/2382120520963039

Soler, M. G., and Dadlani, K. (2020). Resetting the Way We Teach Science is Vital for All Our Futures. Word Economic Forum. Available at: https://www.weforum.org/agenda/2020/08/science-education-reset-stem-technology (Accessed August 26, 2021).

Thode, H. (2002). Testing for Normality. New York: Marcel Dekker Inc., 99–123.

Tinto, V. (1975). Dropout from higher education: a theoretical synthesis of recent research. Rev. Educ. Res. 45, 89–125. doi: 10.3102/00346543045001089

Tinto, V. (2017). Through the eyes of students. J. Coll. Stud. Retent.: Res. Theory Pract. 19, 254–269. doi: 10.1177/1521025115621917

UNESCO (2019). L’Oréal-UNESCO For Women in Science Programme. International Rising Talents. Available at: https://www.forwomeninscience.com/authority/international-rising-talents

University Colorado Boulder (2021) PhET: Interactive Simulations. Available at: https://phet.colorado.edu/en/research (Accessed August 30, 2021).

Valle, A., Regueiro, B., Rodríguez, S., Piñeiro, I., Freire, C., Ferradás, M., et al. (2015). Perfiles motivacionales como combinación de expectativas de autoeficacia y metas académicas en estudiantes universitarios. Eur. J. Educ. Psychol. 8, 1–8. doi: 10.1016/j.ejeps.2015.10.001

Van den Hurk, A., Meelissen, M., and van Langen, A. (2019). Interventions in education to prevent STEM pipeline leakage. Int. J. Sci. Educ. 41, 150–164. doi: 10.1080/09500693.2018.1540897

Van der Houwen, K., Schut, H., van den Bout, J., Stroebe, M., and Stroebe, W. (2010). The efficacy of a brief internet-based self-help intervention for the bereaved. Behav. Res. Ther. 48, 359–367. doi: 10.1016/j.brat.2009.12.009

Vogel-Heuser, B., Bi, F., Land, K., and Trunzer, E. (2020). Transitions in teaching mechanical engineering during COVID-19 crisis. Interact. Des. Archit. 47, 27–47.

Wilcox, R. R., and Tian, T. S. (2011). Measuring effect size: a robust heteroscedastic approach for two or more groups. J. Appl. Stat. 38, 1359–1368. doi: 10.1080/02664763.2010.498507

Wong, B. T.-M., Li, K. C., and Choi, S. P.-M. (2018). Trends in learning analytics practices: a review of higher education institutions. Interact. Technol. Smart Educ. 15, 132–154. doi: 10.1108/ITSE-12-2017-0065

Yuen, K. K. (1974). The two-sample trimmed t for unequal population variances. Biometrika 61, 165–170. doi: 10.1093/biomet/61.1.165

Zhang, Y., Ghandour, A., and Shestak, V. (2020). Using learning analytics to predict students performance in moodle LMS. Int. J. Emerg. Technol. Learn. 15:102. doi: 10.3991/ijet.v15i20.15915

Zilvinskis, J., and Willis, J. E. III (2019). Learning analytics in higher education: a reflection. InSight: A Journal of Scholarly Teaching 14, 43–54. doi: 10.46504/14201903zi

Keywords: learning beliefs, learning analytics, university students, higher education, COVID-19

Citation: Lobos K, Sáez-Delgado F, Cobo-Rendón R, Mella Norambuena J, Maldonado Trapp A, Cisternas San Martín N and Bruna Jofré C (2021) Learning Beliefs, Time on Platform, and Academic Performance During the COVID-19 in University STEM Students. Front. Psychol. 12:780852. doi: 10.3389/fpsyg.2021.780852

Received: 21 September 2021; Accepted: 18 November 2021;
Published: 16 December 2021.

Edited by:

Adrian Castro-Lopez, University of Oviedo, Spain

Reviewed by:

Jorge Maldonado Mahauad, University of Cuenca, Ecuador
Javier Pulgar, University of the Bío Bío, Chile
Joana R. Casanova, University of Minho, Portugal

Copyright © 2021 Lobos, Sáez-Delgado, Cobo-Rendón, Mella Norambuena, Maldonado Trapp, Cisternas San Martín and Bruna Jofré. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Alejandra Maldonado Trapp, alemaldonado@udec.cl