
ORIGINAL RESEARCH article

Front. Educ., 24 April 2023
Sec. Digital Education
This article is part of the Research Topic Educational Digital Transformation: New Technological Challenges for Competence Development

Lëttëra web platform: A game-based learning approach with the use of technology for reading competence

Emilia Fernanda Leal Uhlig1, Carolina Garza León1, Xóchitl Cruz Vargas1, Sheila Hernández Franco1 and May Portuguez-Castro2*
  • 1Prepatec Eugenio Garza Lagüera, Tecnologico de Monterrey, Monterrey, Mexico
  • 2Institute for the Future of Education, Tecnologico de Monterrey, Monterrey, Mexico

Introduction: This study explores the potential of technology, metacognition, and game-based learning to improve reading literacy in upper secondary school students. The focus is on the Lëttëra educational innovation, a web-based platform that uses game-based learning and technology to develop reading literacy.

Methods: This is a quantitative, exploratory, descriptive, and quasi-experimental study that reviewed 149 responses from high school students who took the standardized test Planea 2017. The study aimed to analyze whether using the Lëttëra platform brought a change in the students’ reading competence. The authors also examined students’ motivation toward technology, the platform interface, and the game. The data was analyzed both descriptively and inferentially.

Results: The results showed that using the Lëttëra platform significantly improved students’ competencies in literary text, information construction, and argumentative text. It also increased their motivation toward the proposed activities.

Discussion: This study demonstrates that integrating technology and game-based learning into reading instruction can lead to improved reading competencies and increased motivation among students. These findings are useful for educators, curriculum developers, and policymakers who aim to enhance reading instruction by integrating technology into their teaching practices.

Conclusion: Overall, this study highlights the potential of technology and game-based learning to improve reading literacy in upper secondary school students. The Lëttëra platform provides a promising approach for enhancing reading instruction, and its integration into teaching practices can benefit students, educators, and policymakers alike.

1. Introduction

Nowadays, we live in a society where human beings are bombarded daily with visual and auditory information through various forms of media. This is precisely why reading competence for young people is vital, as they are confronted with multiple textual or graphic issues in which their reading level is the basis for their decision-making. Furthermore, reading is a fundamental skill for success in adult life (OECD, 2013). However, acquiring these skills is challenging for young people. This situation generates disinterest and frustration, compounded by the fact that the media used are not attractive for their age and that their interests have been changing due to the availability of new technologies.

The emergence of e-books, audiobooks and other digital reading platforms has brought about a considerable change in the way individuals read. Technology has advanced our reading habits and sparked interest in both young people and adults (U.S. Department of Education, National Center for Education Statistics, 2018). Sun et al. (2021) indicate that interactive experiences, such as online book clubs and digital storytelling apps, can increase curiosity toward reading for all ages. Additionally, during a pandemic like COVID-19 where social distancing impacts education, teachers are using educational projects that incorporate technology to develop students’ reading skills while bringing them closer together (Grynyuk et al., 2022).

According to the most recent results of the Organization for Economic Cooperation and Development (OECD) Programme for International Student Assessment (PISA), applied in 2018 (the 2021 edition was postponed to 2022 due to the COVID-19 pandemic), Mexican students scored below the international average. According to Salinas et al. (2018), “in Mexico, only 1% of students performed at the highest proficiency levels (level 5 or 6) in at least one area (OECD average: 16%) and 35% of students did not obtain a minimum level of proficiency (Level 2) in all 3 areas (OECD average: 13%)” (p. 3). Although several efforts have been made at the national level to increase reading literacy among students aged 15 and older, average performance has remained stable in reading, mathematics, and science throughout most of Mexico’s participation in PISA (Salinas et al., 2018). Since 2000, when Mexico first participated in this test, there has been no progress in any of the assessed areas.

Assessment is one of the essential features for ensuring the quality of education systems in developed countries. In Mexico, efforts have been made over the last two decades to assess primary and secondary education through three standardized tests: EXCALE, Planea, and PISA (Caracas Sánchez and Ornelas Hernández, 2019), and the National Institute for the Evaluation of Education (INEE) has published studies and literature on the subject to reinforce practical exercises aimed at increasing reading comprehension; nevertheless, students’ results have remained the same. Another important aspect is support for teachers regarding motivation, knowledge, and tools, to make them aware of their transformative power and the need to use ICTs to achieve learning (UNESCO, 2017).

For this reason, this study focuses on encouraging reading competence through an educational innovation based on games and technology that helps improve this skill in high school students. In this sense, the development of reading literacy in the high school period is transcendental for students to become actively involved in society, considering that deficiencies in reading skills can limit their potential for the future (Sucena et al., 2022). Reading literacy is defined as understanding, evaluating, reflecting on, and engaging with texts to achieve one’s goals, develop knowledge and personal potential, and participate in society (OECD, 2018). However, the scientific literature related to the use of ICT in reading processes is still scarce (Fernández Batanero et al., 2021), even though the use of digital tools and active pedagogical strategies favors the development of these competencies (Neira-Piñeiro, 2015; Badillo-Jiménez and Iguarán-Jiménez, 2020).

This study seeks to analyze how technology used in an entertaining way (game-based learning) helps make metacognition visible and supports the development of the critical thinking necessary to foster reading competence in a user-friendly and self-manageable way. Therefore, the research objectives were: (1) analyze how the use of technology in the form of games can improve the reading competence of high school students in Mexico; (2) evaluate the impact of the use of the web platform Lëttëra on the reading competence of students; and (3) identify how students perceive the use of technology in reading learning and whether this affects their motivation and satisfaction with the learning process. These objectives were addressed through a comparative study of the results obtained from applying the Planea 2017 test before and after using Lëttëra, a web platform designed according to PISA’s proposed reading comprehension processes and performance levels. The variables measured were academic performance on the Planea 2017 test and student satisfaction, assessed through a survey.

Although there are some studies that demonstrate the benefits of using digital tools and active pedagogical strategies to promote reading, there has not been enough research on how game-based learning can improve metacognition and critical thinking in secondary school students, which is considered fundamental for the development of reading competence. In addition, many studies conducted in Mexico have shown that students’ results in standardized reading tests have been insufficient, indicating the need to implement new pedagogical strategies and tools to improve the quality of education and the development of reading competence. This study can be useful for educators, educational technology designers, researchers in the field of education, and decision-makers in educational policy who are interested in improving the reading competency of secondary school students in Mexico using technology, particularly online educational platforms. Additionally, the study’s findings can be helpful for parents and students interested in using technology effectively to enhance reading learning.

2. Literature review

2.1. Programme for international student assessment test

The design of the reading comprehension exercises of the Lëttëra web platform is based on the PISA test, on the rationale that the current assessments in Mexico are aligned with it. In addition, PISA represents a commitment by the governments of OECD countries to regularly monitor the performance of education systems in terms of student achievement within a common internationally agreed framework (OECD, 2018).

The PISA test aims to assess whether 15-year-old students have acquired the knowledge and skills to participate fully in a knowledge-based society. The test is developed and coordinated by the OECD and evaluates students’ abilities in reading, mathematics, science, collaborative problem-solving, financial literacy, and global competence (U.S. Department of Education, National Center for Education Statistics, 2018). The tests are administered to a sample of students from each participating country, and the results allow countries to compare their educational systems with those of others (Civini, 2019). It is therefore an important tool for policymakers, educators, and researchers to evaluate the effectiveness of educational systems in different countries. It is administered every 3 years and focuses on three core areas: reading, mathematics, and science (Bohrnstedt and Stancavage, 2016).

The definition of reading has changed over time, mainly as the internet has generated new ways of reading and is now seen “as an expanding body of knowledge, skills and strategies that individuals construct over a lifetime in diverse contexts, through interaction with peers and the wider community” (OECD, 2018, p. 9). In this way, PISA organizes reading literacy into three dimensions: text, situations, and processes.

The way content is presented in texts determines how they are managed. Texts can be continuous, where sentences and paragraphs form broader structures like articles, essays, or stories; non-continuous, organized from non-sequential information such as diagrams, infographics, or advertisements; or mixed, combining the two forms (OECD, 2017). According to their target audience and purpose of creation, texts are categorized into personal, public, educational, and occupational situations. The interaction between readers and text determines the cognitive processes involved, which include locating information, comprehension, evaluation, and reflection on the text (OECD, 2018).

From 2018 onwards, this assessment has included texts presented digitally. These texts are classified according to (1) how information is accessed (static or dynamic); (2) the amount of information displayed (a single source or multiple sources of two or more); (3) their format (continuous, non-continuous, and mixed); and (4) their discursive purposes (narrative, expository, argumentative, prescriptive, and transactional).

The Lëttëra web platform uses continuous, discontinuous, and mixed texts. It is based on performance levels 1c to 3 suggested by PISA (National Commission for the Continuous Improvement of Education, 2018). These levels guide the underlining and commenting in the readings, where different colors are used to direct reading competence metacognitively; they also inform the design of the reading challenges and exercises and the immediate feedback provided to users, relating the performance levels to the PISA reading competence, as shown in Table 1.


Table 1. Reading literacy performance levels, PISA 2018 (National Commission for the Continuous Improvement of Education, 2018, p. 44).

The most basic levels are the first five: 1c, 1b, 1a, 2, and 3; the highest performance levels are 4, 5, and 6. This first edition of the Lëttëra web platform is designed to help users develop their skills at the first five levels.

2.2. Assessing reading literacy through standardized tests in Mexico

According to Caracas Sánchez and Ornelas Hernández (2019), in Mexico, evaluation played a relevant role in the 1980s. Evaluation through standardized tests was a strategy used to improve the country’s education system due to educators’ high attrition rate and low efficiency. In that decade, Mexico created the National Evaluation Centre for Higher Education (CENEVAL) to regulate the evaluation of education in the country, as the International Monetary Fund and the World Bank assigned economic resources for teaching in Mexico as support to settle the country’s foreign debt (Aranda Izguerra, 2005).

One of the conditions for allocating resources was the commitment to improve education. So, through the design and application of standardized tests such as Planea or PISA, the country would substantially improve the education system. Starting in 2000, the OECD, through PISA, began the application of international tests for 15-year-old students. Mexico requested that the PISA test be applied in educational institutions to improve education and fulfil its commitment. The PISA test focused on finding performance indicators in mathematics, science and reading. This assessment does not focus on the curricula of the participating countries but on the progress of young people coping with the knowledge society (Jiménez Moreno, 2016).

2.3. Self-directed learning

A self-directed learner identifies and achieves goals through effective learning strategies, understanding, monitoring, directing, evaluating, and reflecting on their own process, ultimately taking the control that enables them to decide which methods to use. Students must be self-directed learners to experience effective education and lifelong learning (Bagheri et al., 2013).

The Lëttëra web platform promoted the students’ self-directed use of the exercises. The platform had an immediate feedback system that allowed students to learn about the cognitive process required to answer each question and the justification that determined the correct answer for each item. Another factor that eased the self-directed use of the platform was the challenge map, which visually displayed the students’ progress toward completing all the activities. The last factor was the flexibility in determining when to complete the activities, since, during the semester, the students had several weeks to complete all the reading exercises.

2.4. Game-based learning

The design of the Lëttëra web platform considered the components that provide an attractive learning environment for high school students. These components were: the use of technology, gamification, self-directed learning, and metacognition through the underlining of PISA cognitive processes in the texts. Gamification refers to the incorporation of game design elements such as point systems, leaderboards, and rewards into non-gaming contexts (Høiseth et al., 2021). The goal of gamification is to increase motivation, engagement, and participation in activities that are not inherently fun or interesting by making them more enjoyable through game-like elements (Bicen et al., 2022). The Lëttëra web platform is based mainly on the gamification methodology. Games have long been known in the educational world for their effectiveness in combining goals such as having fun, socializing, and learning with consequences such as winning and losing according to a specific rule system (Baran et al., 2018). In recent years, gamification has gained popularity among teachers due to the gamified designs and game mechanics added to non-game processes (Wong et al., 2022). It has been embraced across a range of fields, including education, healthcare, marketing, and customer service.

In general terms, a game is an application, while gamification is a process where game components are integrated into a non-game environment (Attali and Arieli-Attali, 2015; Abdul Ghani et al., 2022). The main objective of gamification is to effectively implement the positive effects of gaming in educational environments to increase student engagement, stimulate their educational participation, and improve outcomes (Deterding et al., 2011). Gamification seeks to harness the power of games to solve real-world problems (Hüseyin et al., 2020), making it the most appropriate approach to use on the Lëttëra web.

Different studies have analyzed the use of gamification in the school environment. In the last 2 years, due to the COVID-19 pandemic, initiatives have emerged that seek to improve student motivation in virtual environments due to social distancing (Chans and Portuguez Castro, 2021). Singh et al. (2021) identified that online activities require greater self-regulation and motivation for students to participate, so technology plays a fundamental role in improving methodologies; they also suggest that creating gamified environments enhances interaction and collaborative learning.

Another study by Chans and Portuguez Castro (2021) was conducted in the Mexican context, where students carried out gamification activities in chemistry classes. The results showed that gamification elements such as autonomy and feedback allowed for the development of learners’ intrinsic motivation. In the case of language teaching, Alharbi and Khalil (2022) mention that gamification through digital platforms gives the student more time to interact, and receiving immediate feedback engages them more to continue learning.

Game-based learning is another differentiator proposed by Lëttëra. This methodology has an impact on the students’ motivation, making the didactic model more meaningful and positioning the young person as the protagonist of the learning (Cueva Gaibor, 2020).

The gamification in the Lëttëra web platform consists of a set of game rules for completing challenges and obtaining the highest number of badges. Each student can engage with each challenge twice. Progress is displayed on a map that shows advancement through each challenge (reading exercise), encouraging continuous improvement through the feedback presented at the end of each challenge. The texts of the reading exercises were selected from different areas of knowledge and in different formats, as established by PISA: continuous and non-continuous texts (INEE, 2018). The various topics presented in the texts allowed students to learn about different social, economic, and cultural issues according to the type of text, thus linking reading with real-life contexts in a gamified environment.

2.5. Metacognition

The concept of metacognitive monitoring emerged in the 1970s. Pioneering work by John Flavell laid the groundwork for this construct by describing how a person reflects or thinks about their own cognition (Crespo, 2000). Flavell defined the concept of metacognition as “thinking about thinking.” He divided the idea of metacognition into four main aspects: metacognitive knowledge, metacognitive experience, objectives or goals, and strategies.

At a broad level, the basis of metacognition is in the individual’s mind. Metacognition has been positioned within what Moshman (2008) classified as endogenous constructivism. Metacognition relates to the abstract reflection of new or existing cognitive structures. In this sense, metacognition emphasizes learning development rather than the learner’s interaction with the environment (Dinsmore et al., 2008).

The intention of the Lëttëra web platform to contribute to the development of students’ metacognition is a priority, equipping them as critical readers who can transfer these skills to other learning areas or situations. It does so by visually providing different colored underlining according to the cognitive processes proposed by the PISA test, which evaluates locating information, understanding, and assessing and reflecting. For each of these three classifications of cognitive processes, a different underline color was assigned, allowing the student to visualize the cognitive processes involved in responding to the reading comprehension exercise in the text.

The design of each item considers a predominant cognitive process that determines the correct answer. Once the students had completed the entire reading exercise, feedback was provided for each item, explaining how the cognitive process was a determining factor in selecting the correct answer. In this way, the students could monitor their cognitive processes and reflect on how they read, building their metacognition as they progressed through the Lëttëra challenges. The cognitive processes exposed visually through the underlining were a constant that familiarized students with aspects of reading that are often carried out automatically.

3. Methodology

This research corresponds to a quantitative, exploratory, descriptive, and quasi-experimental study in which a technological innovation based on games is used to develop reading skills. A standardized test was used to analyze the study variables, and a descriptive analysis was carried out of the results obtained before and after using the educational innovation.

This project was part of the Novus projects and was selected for funding and support to develop the platform for reading skills. Novus is an initiative of the Institute for the Future of Education that seeks to strengthen the culture of evidence-based educational innovation among the professors of the Tecnológico de Monterrey (Portuguez-Castro et al., 2022). As part of its impact measurement strategy, the faculty is trained and mentored to submit a research protocol for approval. Due to the number of projects supported by the initiative, Novus has worked closely with the ethics committees to ensure that its protocols follow federal and international regulations regarding research subjects and their integrity. During training and follow-up, the Novus mentorship team ensures everything from methods to ethics is considered in the submission. Once this is the case, the proposal is approved by Novus and faculty may begin to work on their project.

The technological innovation of the Lëttëra web platform aims to create a virtual space for developing autonomy through a gaming environment, reading asynchronously, and receiving immediate feedback. This innovation uses a user-friendly interface with stimulating aspects that optimally favor teaching and learning. In addition, the teacher benefits from a reduction in revision work, since the tool monitors each student’s level, administers the exercises, evaluates, and provides feedback; thus, the student manages their own learning progress.

The research seeks to analyze how technology used in game-based learning helps make metacognition visible and makes the development of the critical thinking necessary for reading literacy friendly and self-manageable. This analysis was based on a comparison of the Planea 2017 test results to assess the learning and development of a life skill such as reading literacy. Specifically, from the OECD’s perspective through the PISA test, “reading should therefore be considered through the various ways in which citizens interact with texts on various devices and how reading is part of lifelong learning” (OECD, 2018, p. 8).

3.1. Procedure

The experiment consisted of several phases. The first was administering the Planea test 2017 as a diagnosis for students before using the platform. In the second, Lëttëra was implemented in the classroom using autonomous, flexible, and enjoyable learning technology. In the third phase, the results were evaluated through a second application of the test to assess the data obtained comparatively and by cognitive processes. The data was collected through an online form before and after completing the activities on the platform. Once collected, the data was anonymized and analyzed as a whole to protect the identity of the participants. Data collection was done prospectively during the semester in which the educational intervention was carried out.

3.2. Participants

The participants in the sample were 149 first-semester students of the Eugenio Garza Lagüera High School of the Tec de Monterrey. This institution is part of a system of 26 high school, professional, and postgraduate campuses distributed throughout Mexico. Three types of baccalaureates are offered: bicultural, multicultural, and international. The inclusion criteria of this study were: (1) students who are currently enrolled in a high school program; (2) students who are willing to participate in the study and provide informed consent (and, if under 18, have parental/guardian consent); and (3) students who meet the specific demographic or academic requirements of the study (e.g., age range, grade level, specific courses taken). The exclusion criteria were: (1) students who are not fluent in the language of instruction (if applicable), and (2) students who do not complete the informed consent. The participants had been previously enrolled in private junior high schools in the metropolitan area of Monterrey, Mexico, and had upper-middle-class backgrounds. Of these 149 students, 78 are male and 71 are female, between 15 and 17 years of age.

3.3. Instruments

The Planea 2017 test is an objective, standardized test aligned to the Common Curriculum Framework, particularly in the fields associated with the Language and Communication and Mathematics competencies for students in Mexico. It is a validated and standardized multiple-choice instrument and consists of 100 items: 50 for Language and Communication and 50 for Mathematics (Planea, 2022). Its objective is “to determine the extent to which students achieve mastery of a set of essential learning at the end of the different levels of compulsory education” (Planea, 2022, p. 9), assessing reading comprehension and reading literacy through the formative field of Language and Communication, as well as mathematics.

The Planea test evaluates the use of two cognitive processes: the first is reading competence, which comprises (a) the extraction of information, (b) the development of a global comprehension, (c) the development of an interpretation, (d) the analysis of content and structure, and (e) critical evaluation of the text; the second process assesses reflection on language, consisting of (a) semantic reflection, (b) syntactic and morphosyntactic reflection, (c) linguistic conventions, and (d) knowledge of sources of information. The validity of the test has been established through a thorough review of the test’s content and structure by experts in the field of education. As for reliability, the test has been subjected to statistical analysis that has demonstrated adequate internal consistency and acceptable inter-rater reliability, suggesting that the test provides consistent and accurate results (INEE, 2018).

The Language and Communication competence assesses learning related to the cognitive processes and knowledge required for the selection, comprehension, and interpretation of texts with different characteristics, purposes, and thematic axes: argumentative, expository, and literary texts, in their continuous (text) and non-continuous (text and image) modalities, in order to determine mastery of the set of essential learning in this competence. The test comprises four categories: expository text, argumentative text, literary text, and construction of information, measured at four levels of achievement. These levels are described in Table 2.


Table 2. Achievement levels of the Planea 2017 test (INEE, 2018).

The development of reading literacy in students was determined through the comparative analysis of the quantitative results of the Planea 2017 test between the initial and final application.

3.3.1. Programme for international student assessment and Planea achievement levels

The PISA test presents eight levels of achievement, from 1c to 6, in three different processes: locating information; understanding; and evaluating and reflecting. The Planea test presents four processes for reading literacy: extracting information, overall understanding, developing an interpretation, and analyzing content and structure.

The Planea Level IV correlates with PISA levels 3 and 4, specifically concerning interpreting the meaning of the nuances of language in a section of the text, demonstrating understanding in interpretive tasks, and comparing perspectives and drawing inferences based on diverse sources (National Commission for the Continuous Improvement of Education, 2018). The Lëttëra web platform focuses on the first PISA achievement levels (1c to 3), which correspond to level IV (the highest) of the Planea test.

3.3.2. Satisfaction survey

A survey was designed to determine the students’ motivation to use the Lëttëra platform. The questionnaire consisted of 17 closed questions in which they answered “true,” “false,” or “doubtful” according to their perception of the exercises and their experience with the tool. Two questions were included to identify the age and gender of the student. Fifteen closed questions assessed whether they felt an impact on their confidence in reading; whether using the platform motivated them to read; whether the interface and the methodology used made reading easy; and whether they considered the platform appropriate for improving their reading comprehension.

The student satisfaction survey was validated by Spanish language teachers and an education researcher to ensure the quality and reliability of the obtained results.

3.4. Description of the Lëttëra platform

Lëttëra aims to develop reading skills in young people in upper secondary education. It comprises 12 reading comprehension exercises containing literary texts and academic, journalistic, and popular science articles. These texts allow users to explore different topics, expand their vocabulary, and above all, understand, interpret, analyze, and extract information from these texts that students are likely to encounter at school and in their daily lives. Figure 1 shows the homepage of the platform.


Figure 1. Home page of the Lëttëra web platform.

3.4.1. Exercise examples

“Explore reading” is a core component completed before starting the exercises, in which the students are guided through highlighting and comments designed to enhance their critical reading skills, foster metacognition, and help them become critical readers, according to the PISA descriptors. These processes are color-coded: green represents the cognitive skills of locating information, blue highlights those of understanding, and orange identifies skills related to reflection and evaluation, as well as the achievement levels of the Planea test.
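To make this color scheme concrete, the following minimal sketch (written in Python purely for illustration; the article does not describe the platform’s actual implementation, so the names, markup, and fallback color are assumptions) shows how the three process groups could be mapped to the highlight colors mentioned above and applied to a text span:

# Hypothetical sketch: map PISA cognitive-process groups to the highlight colors described above.
PROCESS_COLORS = {
    "locate_information": "green",
    "understand": "blue",
    "evaluate_and_reflect": "orange",
}

def highlight(span_text: str, process: str) -> str:
    """Wrap a text span in an HTML mark tag colored according to its cognitive process."""
    color = PROCESS_COLORS.get(process, "yellow")  # assumed fallback for an unknown process
    return f'<mark style="background:{color}">{span_text}</mark>'

# Example: mark a fragment that asks the reader to locate explicit information.
print(highlight("the date of the expedition is stated in paragraph two", "locate_information"))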

Several game-based strategies were used to design a more entertaining environment within the platform; for example, learners can choose their profile picture or avatar.

The texts are divided into three modules: mountaineering challenge, aquatic challenge, and countryside challenge, each with four challenges containing reading comprehension exercises, and learners can visualize their progress on an interactive map.

3.4.2. Rewards system

The user’s effort is rewarded through a game system, which is explained in the “Explorer’s Guide” section (see Figure 2). Each player has two opportunities to tackle each reading comprehension exercise. If they get 100 points on the first attempt, they obtain two badges and the Lëttëra logo, which are then colored in the “Rewards” section. If students obtain fewer than 100 points but more than 70 on the first attempt, they can either (1) attempt the reading again or (2) keep the score achieved and get two badges. If they do not pass on the first attempt, they can try again, and if they pass (score higher than 70) they get one badge; if they do not pass on the second attempt either, they obtain no badges and must move on to the next challenge with the highest score achieved.


Figure 2. Explorer’s guide, explaining the rules of the game.
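As a rough illustration of these rules, the sketch below (in Python and entirely hypothetical; the passing threshold of 70, the behavior at exactly 70 points, and the choice to keep a passing first-attempt score are assumptions, since the article describes the rules only informally) models how the number of badges for a single challenge could be computed:

from typing import Optional

def badges_for_challenge(first_score: int, second_score: Optional[int] = None) -> int:
    """Return the number of badges earned for one reading challenge (0, 1, or 2)."""
    FULL = 100  # maximum score for a challenge
    PASS = 70   # assumed passing threshold

    if first_score == FULL:
        return 2  # perfect first attempt: two badges (plus the Lëttëra logo)
    if first_score > PASS:
        return 2  # passed the first attempt and kept the score: two badges
    # The first attempt was not passed, so a second attempt is made.
    if second_score is not None and second_score > PASS:
        return 1  # passed on the second attempt: one badge
    return 0      # failed both attempts: no badges; move on with the highest score

# Illustrative uses (scores are made up):
print(badges_for_challenge(100))     # 2 badges
print(badges_for_challenge(85))      # 2 badges
print(badges_for_challenge(60, 78))  # 1 badge
print(badges_for_challenge(60, 65))  # 0 badges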

Once each challenge or reading comprehension exercise has been completed, Lëttëra provides the user with feedback on each item, identifying the process involved (locating information, understanding, reflecting, and evaluating). It also indicates the correct answer and explains why each option has that status, with the intention of inducing personal reflection so that learners understand what an optimal reader needs to focus on and can self-direct their learning.

The platform also has a menu that allows the learner to view: the Challenge Map, Rewards, Explorer’s Guide, Info Kiosk, Calendar, Edit Profile and Analytics, where they can check the score obtained in each of the challenges.

3.5. Data analysis

The data collected in the instruments were analyzed using descriptive statistics, graphs, and tables to present the results for each category. The total data from the Planea test were reviewed to examine whether there was a change in reading literacy results after using the Lëttëra platform. For this, the correct answer for each question in each of the applications was identified. The mean difference was then determined to establish whether there was a significant difference in the test results. The results were analyzed using Excel and Minitab v.21. The paired samples t-test was applied, given that the data came from the same subjects after the treatment (Johnson and Kurby, 2016).
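For readers who prefer an open-source alternative to Excel and Minitab, the following minimal sketch (Python with NumPy and SciPy; the score arrays are placeholder values, not the study’s data) reproduces the paired samples t-test described above:

import numpy as np
from scipy import stats

# Placeholder pre- and post-intervention Planea scores for the same students (illustrative values only).
pre_scores = np.array([28, 31, 25, 35, 30, 33])
post_scores = np.array([30, 33, 27, 36, 34, 33])

# Paired samples t-test, appropriate because both measurements come from the same subjects.
t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)
mean_gain = (post_scores - pre_scores).mean()

print(f"mean gain = {mean_gain:.2f}, t = {t_stat:.2f}, p = {p_value:.4f}")
# A p-value below 0.05 would indicate a statistically significant change after the intervention.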

The other analysis was performed for each sub-competence measured in the Planea test, for which frequency distributions and percentages were calculated (Hernández Sampieri et al., 2014). Finally, the results for each participant group were compared to identify whether there were significant differences in the results obtained in the post-test after using the platform. The satisfaction questionnaire was analyzed with descriptive statistics to identify the most relevant aspects of the students’ opinions of their experience with Lëttëra.

4. Results

The results of the study aimed to answer the research objectives. For the first objective, “Analyze how the use of technology in the form of games can improve the reading competence of high school students in Mexico,” the Planea test was applied as a pre- and post-test to identify the learning gain of students due to the intervention with the platform. For the second objective, “Evaluate the impact of the use of the web platform Lëttëra on the reading competence of students,” the results of the different categories of the test were analyzed, and the ones that had the greatest changes were determined. Finally, for the third objective, “Identify how students perceive the use of technology in reading learning and whether this affects their motivation and satisfaction with the learning process,” a satisfaction survey was applied to learn the students’ opinions on the impact of the use of educational innovation on their motivation.

4.1. Planea test

The Planea test is a standardized test consisting of 50 items for the subject of Spanish. The research included using this test as a diagnostic tool before students engaged in using the Lëttëra platform. It was also used after their experience to analyze the changes in reading skills that could be attributed to this educational innovation. These results aimed to contribute to the objective of analyzing how the use of technology on the platform improves the reading competencies of students.

The results of the first application of the test are shown in Figure 3. The number of students who completed the test was 149. The maximum possible score for the test was 50 points; the lowest scores fell in the range of 13–17.5 correct answers (3 students), and the highest in the range of 44.5–49 (7 students). Most of the correct answers ranged from 26.5–31 (40 students) to 31.5–65.5 (40 students). The mean was 31.436, with a standard deviation of 6.890.


Figure 3. Distribution of Planea 1 test scores.

In the second test implementation, with the same number of responses as the previous application, the minimum score was in the range of 18–22.2 (8 students), and the maximum score was 50 (1 student) (Figure 4). The highest number of correct answers was concentrated within the 34.8–39 range (46 students). The mean was 33.255, with a standard deviation of 6.396. The number of correct answers above 30 increased from 91 in test 1 to 104 in test 2.


Figure 4. Distribution of Planea 2 test scores.

When comparing the means of the two applications, a difference of 1.82 was identified, showing that overall results were higher after using the Lëttëra platform. Applying the paired samples t-test with a confidence level of 95%, a p-value of 0.004 (p < 0.05) was obtained, indicating that there was a significant difference in the students’ scores after using the platform.

4.1.1. Analysis by student group

The results analyzed correspond to 149 student responses divided into six groups. The distribution of the groups is shown in Table 3. When comparing means for groups A, B, C, and E, there was a decrease in the score between P1 and P2 of between −0.11 and −3.05. When applying the statistical test, it was determined that there were no significant differences in these groups, so there was no evidence of an effect of the educational intervention in this sample. On the other hand, there was a significant difference of 9.09 (p = 0.000) in group D and 6.8 (p = 0.000) in group F.


Table 3. Differences between study groups.

In group D, there was a significant improvement in the mean score of 9.09, with p = 0.000 (p < 0.05). As shown in Figure 5, most of the students in this group improved their scores after the educational intervention (87% of the students improved their number of correct answers).


Figure 5. Differences in scores obtained in group D.

Group F also showed a significant improvement in its results, with an increase in scores of 6.8 and p = 0.000 (p < 0.05). Students also improved in most of the questions in the second test application, as shown in Figure 6 (87% of the students improved their number of correct answers).


Figure 6. Differences in scores obtained in group F.

When looking at the results per student, it was found that the students with the greatest improvement between the two tests belong to these groups (D and F). Student 1 improved by 26 points, student 2 by 24, and student 3 by 21 (these three students belong to group D); student 4, from group F, improved by 17 points between P1 and P2.

4.1.2. Analysis according to the categories of the Planea test 2017

As mentioned above, the results of the Planea 2017 test were analyzed according to the four categories: information construction, argumentative text, expository text, and literary text. These results aimed to contribute to the objective of evaluating the impact of the use of the web platform Lëttëra on the reading competence of students. In the first test administration, the data allowed for a diagnosis of students’ prior knowledge. Students were most successful in the questions related to identifying expository texts (72.67%), followed by identifying literary texts (64.90%), information construction (64.32%), and finally, identifying argumentative texts (63.03%).

In the second application of the test, students scored the highest percentage of correct answers in the identification of literary texts (69.60%), followed by expository text (67.42%), construction of information (65.25%) and argumentative text (62.90%).

The difference between the two administrations of the test was highest in the literary text category, with an improvement of 4.70%, followed by information construction (0.93%). However, there was no improvement in the argumentative and expository text categories; in fact, there was a decrease in students’ performance in these skills, as shown in Figure 7.


Figure 7. Differences in results P1 and P2.

For each level of achievement according to the Planea 2017 test, it was possible to identify that, in test 1, students’ level was highest in the Expository text category, reaching 85.6% at level II and 79.19% at level III of the same category, followed by 70.28% at level III of Information construction. The lowest levels in this first diagnosis were level IV of Information construction (43.46%), level IV of literary text (57.94%), and level IV of argumentative text (58.17%).

In the second test, the students continued to obtain their highest results at level II of the expository text category (75.50%), followed by level III of literary text (70.18%) and level III of argumentative text (69.80%). In this last category, level IV remained low, at 44.52%.

When comparing both results, it was observed that in the Literary text category, students improved their achievement of level IV by 10.29% when retaking the test after the educational intervention, as did level IV of Information construction (IC-IV), with 7.89%. There was also an improvement of 4.95% in the results of level III in the argumentative text category (A-III). The results are shown in Figure 8.


Figure 8. Differences in P1 and P2 tests by level of attainment.

4.2. Satisfaction survey

The satisfaction survey was analyzed descriptively to identify the highest response rates for the following categories: self-confidence, motivation, interface, methodology, and reading comprehension. These results aimed to contribute to the objective of identifying how students perceive the use of technology in reading learning and whether this affects their motivation and satisfaction with the learning process. A survey was conducted among the 149 students who used the Lëttëra web platform in the August–December 2021 semester, of whom more than 93% were aged 15–17, 53.6% were male, and 46.4% were female.

The survey asked about students’ self-confidence in the reading comprehension process. Most of the students considered that they could do the exercises (82.4%), that they were good at answering questions about the readings (62.1%), and that they would understand the exercise questions (66%). The results are shown in Figure 9.


Figure 9. Responses related to self-confidence towards reading processes.

The interface made it easy for the students to engage in the reading exercises (75.8%), and most considered entering Lëttëra uncomplicated (62.7%). When asked about the methodology, we identified that the most positive feature they found was the support the platform offered whenever they felt lost. Reading the highlighted parts and the text indications helped them understand better (80.30%). They liked the design, the rewards system, the possibility of selecting a profile, and the map (73%). They also acknowledged that the questions were challenging, as only 28.70% answered them correctly on the first try (see Figure 10).


Figure 10. Responses on the methodology used in the Lëttëra platform.

Of the respondents, 46.4% considered working on the platform’s activities motivating; 39.2% indicated that they would learn how to read with this platform, and the majority had high expectations of how the platform could help them improve their reading (57.90%). A high percentage of the students considered that their reading comprehension level increased with Lëttëra (77%), as shown in Figure 11. Overall, 60.10% rated Lëttëra as “good,” 28.80% as “excellent,” and 11.10% as “fair.”


Figure 11. Students’ motivation to use the Lëttëra platform.

5. Discussion

Developing reading competence is crucial for high school students, as critical readers must possess a considerable amount of knowledge and skills, such as observation, identifying details, relating ideas, comparing, contrasting, and inferring. The cultural context also influences the reading process. Therefore, it is essential to innovate and bring about cultural change in Mexico (INEE, 2018, p. 6). One critical way to promote this cultural change is through implementing reading programs in schools that focus on providing students with the tools they need to become better readers.

In this sense, this research aimed to propose an educational innovation based on gamification that would allow students to develop reading competencies through exercises that facilitated their achievement. The study’s first objective was to analyze how the use of technology in the form of games can improve the reading competence of high school students in Mexico. It was observed that after using the platform, students had higher scores on the test. As in other studies, this suggests that incorporating technology in the form of games into reading learning can be an effective way to enhance students’ reading competence (Hüseyin et al., 2020). The use of interactive and engaging activities can capture students’ attention and motivate them to learn, leading to better performance on assessments (Chans and Portuguez Castro, 2021).

For the second proposed objective, evaluating the impact of the use of the web platform Lëttëra on the reading competence of students, the study found that using the platform had a positive impact on students’ reading competence. Specifically, students showed improvement in literary text comprehension and information construction skills. This could be attributed to the platform’s accompanying comments being directly related to the PISA processes and established levels for each skill. Another element to observe is that the “Explore reading” section strengthens prior knowledge about continuous texts (argumentative, expository, literary) and mixed texts (posters, receipts, infographics, among others). Overall, this finding suggests an enhancement in metacognition among students, which can help them understand and interpret different types of texts in various situations while promoting meaningful lifelong learning, as pointed out by Dinsmore et al. (2008).

For the objective of identifying how students perceive the use of technology in reading learning and whether this affects their motivation and satisfaction with the learning process, it was found that most students considered that using the platform improved their level of reading comprehension and found it friendly and easy to understand. Most of the students responded positively to the platform’s design, rewards system, profile selection, and progress map, and mentioned that their educational experience on the platform was favorable. This is considered to be due to another differentiator proposed by Lëttëra, which is learning based on games; this has various repercussions on students’ perception of the task to be performed and their motivation, making the didactic model more meaningful and stronger in less time, positioning the young person as the protagonist of learning (Cueva Gaibor, 2020).

6. Conclusion

In conclusion, the study shows that incorporating technology in reading instruction, particularly through the web platform Lëttëra, can have a positive impact on students’ reading competence and motivation. The novelty and significance of the study lie in its ability to make the didactic model more meaningful and stronger in less time by presenting comments related to established levels for each skill and by learning through games, positioning the student as the protagonist of their own learning.

Another aspect is that the game-based learning elements (presenting the exercises twice, the reward system, the immediate feedback, the progress map, and other playful features) increase motivation for learning. They foster an active environment for developing skills, making the reader optimally equipped to handle reading. In this sense, even though much remains to be done, it is necessary to continue innovating and applying the knowledge acquired over time in projects that use technology and game-based learning, in order to respond to the needs of the new generations who interact with these tools daily in other contexts and to promote education that supports the reading process through technology as a means of self-managed practice to develop reading skills. However, further research is needed to determine the long-term impact of using technology in reading learning and to identify the most effective types of games and activities for improving reading competence.

For future studies, it may be interesting to investigate the long-term effects of incorporating technology in reading instruction on students’ lifelong learning and their ability to transfer acquired skills to other areas of their academic and personal lives. We also suggest conducting more qualitative studies that allow for a deeper understanding of these learnings. Furthermore, future research could also explore how the use of technology in reading instruction can be adapted for students with different learning styles or those who may require additional support or accommodations. This study can be useful to educators, curriculum developers, and policymakers who seek to improve reading instruction through the integration of technology in their teaching practices.

Data availability statement

The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.

Ethics statement

Ethical review and approval was not required for the study on human participants in accordance with the local legislation and institutional requirements. Written informed consent to participate in this study was provided by the participants’ legal guardian/next of kin.

Author contributions

EL, CG, XC, and MP-C contributed to conception and design of the study. EL, CG, and XC organized the database. MP-C performed the statistical analysis. EL, CG, XC, and MP-C wrote the first draft of the manuscript. All authors wrote sections of the manuscript, contributed to the manuscript revision, read, and approved the submitted version.

Funding

The authors acknowledge the financial support of NOVUS (Grant number: N20-153), Institute for the Future of Education, Tecnologico de Monterrey, Mexico, in the production of this work. The authors would like to acknowledge the financial support of the Writing Lab, Institute for the Future of Education, Tecnologico de Monterrey, Mexico, in the production of this manuscript.

Acknowledgments

We acknowledge Professor María del Carmen Benítez for her help with the spelling and the grammar revision of this manuscript.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

References

Abdul Ghani, A. S., Abdul Rahim, A. F., Bahri Yusoff, M. S., and Hanim Haide, S. (2022). Developing an interactive PBL environment via persuasive gamify elements: A scoping review. Res. Pract. Technol. Enhanc. Learn. 17:21. doi: 10.1186/s41039-022-00193-z

Alharbi, K., and Khalil, L. A. (2022). Descriptive study of EFL teachers' perception toward e-learning platforms during the COVID-19 pandemic. Electron. J. e-Learn. 20:4. doi: 10.34190/ejel.20.4.2203

Aranda Izguerra, J. (2005). Las relaciones de México con el Fondo Monetario Internacional. Carta de políticas públicas. Available at: http://www.economia.unam.mx/publicaciones/carta/06.html

Attali, Y., and Arieli-Attali, M. (2015). Gamification in assessment: do points affect test performance? Comput. Educ. 83, 57–63. doi: 10.1016/j.compedu.2014.12.012

Badillo-Jiménez, V. T., and Iguarán-Jiménez, A. M. (2020). Uso de las TIC en la enseñanza-aprendizaje de la comprensión lectora en niños autistas. Praxis 16:1. doi: 10.21676/23897856.3406

Bagheri, M., Wan Ali, W., Chong Binti, M., and Mohd Daud, S. (2013). Effects of project-based learning strategy on self-directed learning skills of educational technology students. Contemp. Educ. Technol. 4:1. doi: 10.30935/cedtech/6089

Baran, M., Maskan, A., and Yaşar, Ş. (2018). Learning physics through project-based learning game techniques. Int. J. Instr. 11, 221–234. doi: 10.12973/iji.2018.11215a

Bicen, H., Demir, B., and Serttas, Z. (2022). The attitudes of teacher candidates towards the gamification process in education, BRAIN. Broad Res. Artif. Intell. Neurosci. 13, 39–50. doi: 10.18662/brain/13.2/330

Bohrnstedt, G., and Stancavage, F. (2016) TIMSS, PISA, and NAEP: what to know before digging into the results. Available at: https://www.air.org/resource/blog-post/timss-pisa-and-naep-what-know-digging-results (Accessed March 23, 2023).

Caracas Sánchez, B., and Ornelas Hernández, M. (2019). The assessment of reading comprehension in Mexico. The case of the EXCALE, PLANEA and PISA tests. Perfiles Educ 41:164. doi: 10.22201/iisue.24486167e.2019.164.59087

Chans, G. M., and Portuguez Castro, M. (2021). Gamification as a strategy to increase motivation and engagement in higher education chemistry students. Computers 10:10. doi: 10.3390/computers10100132

Civini, C. (2019). What is the Pisa test and what does it measure? Available at: https://www.tes.com/magazine/archive/what-pisa-test-and-what-does-it-measure.pdf

Crespo, N. (2000). La metacognición: las diferentes vertientes de una teoría. Rev Signos 33, 97–115. doi: 10.4067/S0718-09342000004800008

Cueva Gaibor, D. (2020). Educational technology in times of crisis. Conrado 16, 341–348.

Deterding, S., Dixon, D., Khaled, R., and Nacke, L. (2011). From game design elements to gamefulness: defining gamification. In Proceedings of the 15th international academic mind trek conference: Envisioning future media environments (mind trek '11), New York, USA

Dinsmore, D. L., Alexander, P. A., and Loughlin, S. M. (2008). Focusing the conceptual lens on metacognition, self-regulation, and self-regulated learning. Educ. Psychol. Rev. 20, 391–409. doi: 10.1007/s10648-008-9083-6

Fernández Batanero, J. M., Montenegro Rueda, M., Fernández Cerero, J., and Román Gravan, P. (2021). Impacto das TIC nas habilidades de escrita e leitura: uma revisão sistemática (2010-2020). Texto Livre 14, 1–12. doi: 10.35699/1983-3652.2021.34055

Grynyuk, S., Kovtun, O., Sultanova, L., Zheludenko, M., Zasluzhena, A., and Zaytseva, I. (2022). Distance learning during the Covid 19 pandemic: the experience of Ukraine’s higher education system. Electron. J. e-Learn. 20:3. doi: 10.34190/ejel.20.3.2198

Hernández Sampieri, R., Fernández Collado, C., and Baptista Lucio, P. (2014). Metodología de la investigación. Mexico City: Mcgraw Hill.

Høiseth, M., Alsos, O. A., Holme, S., Ek, S., and Tendenes Gabrielsen, C. (2021). Serious game design to support children struggling with school refusal. Int. J. Serious Games 8, 109–128. doi: 10.17083/ijsg.v8i2.416

Hüseyin, Y., Mübin, K., and Karatas, A. (2020). The views and adoption levels of primary school teachers on gamification, problems, and possible solutions. Particip. Educ. Res. 7:3. doi: 10.17275/per.20.46.7.3

Jiménez Moreno, J. A. (2016). El papel de la evaluación a gran escala como política de rendición de cuentas en el sistema educativo mexicano. Rev. Iberoam. Eval. Educ. 9, 109–126. doi: 10.15366/riee2016.9.1.007

Johnson, R., and Kurby, P. (2016). Estadística Elemental. Mexico City: Cengage.

Moshman, D. (2008). Adolescent psychological development: Rationality, morality, and identity. London: Lawrence Erlbaum Assoc Inc.

National Commission for the Continuous Improvement of Education (2018). Repensar La Evaluación Para La Mejora Educativa. Resultados de México en PISA 2018. Mexico City: Mejoredu.

Neira-Piñeiro, M. (2015). Reading and writing about literature on the internet. Two innovative experiences with blogs in higher education. Innov. Educ. Teach. Int. 52:5. doi: 10.1080/14703297.2014.900452

OECD (2013). OECD skills outlook 2013: First results from the survey of adult skills. Paris: OECD Publishing

OECD (2017). PISA assessment and analysis framework for development: reading, mathematics, and science. Paris: OECD Publishing.

OECD (2018). Sample items used in the PISA 2000 assessment: reading literacy, mathematics, and science. Available at: https://www.oecd.org/education/school/programmeforinternationalstudentassessmentpisa/33692793.pdf

Planea (2022). Plan nacional para la evaluación de los aprendizajes. Available at: http://planea.sep.gob.mx/ms/

Portuguez-Castro, M., Hernández-Méndez, R. V., and Peña-Ortega, L. O. (2022). Novus projects: innovative ideas to build new opportunities upon technology-based avenues in higher education. Educ. Sci. 12, 1–22. doi: 10.3390/educsci12100695

Salinas, D., De Moraes, C., and Schwabe, M. (2018) Programa para la Evaluación Internacional de Alumnos (PISA) PISA 2018-Resultados. Available at: https://www.oecd.org/pisa/publications/PISA2018_CN_MEX_Spanish.pdf (Accessed December 26, 2022).

Singh, P., Duggal, K., and Gupta, L. (2021). “Intrinsic and extrinsic motivation for online teaching in COVID-19: applications, issues, and solution” in Emerging Technologies for Battling COVID-19. Studies in Systems, Decision and Control. eds. F. Al-Turjman, A. Devi, and A. Nayyar (Cham: Springer)

Sucena, A., Silva, A. F., and Marques, C. (2022). Reading skills intervention during the Covid-19 pandemic. Humanit. Soc. Sci. 9:45. doi: 10.1057/s41599-022-01059-x

Sun, B., Loh, C. E., O’Brien, B. A., and Silver, R. E. (2021). The effect of the COVID-19 lockdown on bilingual Singaporean children’s leisure reading. AERA Open 7, 1–21. doi: 10.1177/23328584211033871

UNESCO (2017). E 2030: education and skills for the 21st century. Available at: https://unesdoc.unesco.org/ark:/48223/pf0000250117 (Accessed September 18, 2022).

U.S. Department of Education, National Center for Education Statistics (2018). The condition of education 2018 NCES 2018-144. Available at: https://nces.ed.gov/programs/coe/indicator/cns

Wong, R., Rao, Y., Seong, L., Abd, K., Von, W., Ismail, R., et al. (2022). Gamifying education for classroom engagement in primary schools. Int. J. Eval. Res. Educ. 11, 1360–1367. doi: 10.11591/ijere.v11i3.21918

Keywords: reading competence, game-based learning, metacognition, educational innovation, learning for life, professional education, higher education

Citation: Leal Uhlig EF, Garza León C, Cruz Vargas X, Hernández Franco S and Portuguez-Castro M (2023) Lëttëra web platform: A game-based learning approach with the use of technology for reading competence. Front. Educ. 8:1180283. doi: 10.3389/feduc.2023.1180283

Received: 05 March 2023; Accepted: 27 March 2023;
Published: 24 April 2023.

Edited by:

Antonio Palacios-Rodríguez, University of Seville, Spain

Reviewed by:

Soheil Hussein Salha, An-Najah National University, Palestine
Denok Sunarsi, Pamulang University, Indonesia

Copyright © 2023 Leal Uhlig, Garza León, Cruz Vargas, Hernández Franco and Portuguez-Castro. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: May Portuguez-Castro, may.portuguez@tec.mx
