
BRIEF RESEARCH REPORT article
Front. Educ. , 04 February 2025
Sec. Digital Education
Volume 10 - 2025 | https://doi.org/10.3389/feduc.2025.1498233
This article is part of the Research Topic Psychological Well-Being and Digitalization in Education.
The purpose of this study is to investigate how an adaptive assessment pathway can contribute to promoting personalized learning and improving access to education for all learners, regardless of their individual learning styles, paces, or needs. The study proposes a personalized learning assessment plan incorporating differentiated pedagogy and Universal Design for Learning (UDL). We conducted the experiment on the Moodle platform, taking advantage of Information and Communication Technologies (ICT) to reach a larger number of learners. The research used a mixed methodology to qualitatively analyze the limitations of traditional learning assessment pathways and to quantitatively examine the proposed adaptive pathway’s impact on learning outcomes. While the results show a significant improvement in learning outcomes (88.9% of learners improved in Text study, 50% in Language activity, 55.6% in Written production, 80.6% in Total control, and 77.8% in Oral production), the study also highlights the need for further research into the mechanisms, strategies, tools, approaches, issues, and future prospects associated with learning assessment.
Despite many years of schooling, UNESCO declared in 2017 a major problem in learning acquisition: 617 million learners are unable to acquire basic reading and mathematical skills, and the phenomenon of school wastage affects 40% of primary school learners with special needs, a percentage that rises to 55% at secondary level (Banes et al., 2020). This has prompted the search for various approaches and means to optimize learning. In this respect, the GEM report (Global Education Monitoring Report, 2013) highlighted the important role of technology in improving the quality of learning. However, the impact of technology on education in terms of equity and inclusion varies considerably, and the GEM report therefore stressed the need for a balanced approach to integrating technology into education. The Global Monitoring Report on Education emphasized two key points: first, education systems should be guided to prioritize the interests of learners; second, digital technologies should be used to foster human interaction. In the field of inclusive education, it is crucial to diversify the ways information is represented in order to meet the variety of learners’ needs and preferences (Almumen, 2020). Given that conventional approaches have shown limitations in terms of teaching effectiveness and skills acquisition, the study by Smale-Jacobse et al. (2019) focuses on differentiated teaching both as a critique of the pedagogical uniformity of conventional methods and as a path toward inclusive education that values and recognizes the diversity of learners (Bondie et al., 2019). Highlighting the potential of differentiated pedagogy to personalize teaching according to learners’ needs, this work calls for further, more rigorous studies to assess the effectiveness of differentiation and to address the limitations of one-size-fits-all teaching methods. According to Cook and Rao (2018), García-Campos et al. (2020), and Odier-Guedj et al. (2023), the inclusive educational environment is characterized by broad pedagogical differentiation and a universal conception of learning, which converges with the view of Rogers-Shaw et al. (2018). The present study focuses on the application of different approaches simultaneously to provide an effective learning environment, with the aim of proposing alternatives for the perception and comprehension of information. According to Rogers-Shaw et al. (2018), learners can access information presented through visual, auditory or interactive media, in a way that respects their learning pace. It is therefore crucial to adapt pedagogical content to respect learners’ diversity, a challenge that has emerged through studies concerned with evaluating learners fairly.
Skills assessment is a fundamental pillar of inclusive education. Many researchers emphasize that the diversity of learner profiles, in terms of needs, learning styles and abilities, must be at the heart of pedagogical practices (Banes et al., 2020). The Universal Design for Learning (UDL) approach is based on principles that aim to eliminate barriers to learning by adopting a variety of strategies and tools. In particular, it stresses the importance of diversifying resources and pedagogical approaches to ensure equitable access to educational content (Hehir, 2002). Meanwhile, differentiated pedagogy focuses on adapting assessment methods to individual needs, taking into account the initial skills and specific goals of each learner (Wormeli, 2023; Van Geel et al., 2019). However, although both approaches are recognized as essential levers for inclusive education, few works integrate these models into a unified framework (Delaney and Hata, 2020; Kusumaningsih, 2021).
Digital platforms, particularly Learning Management Systems (LMS), offer powerful tools for adapting and personalizing assessment. Moodle, as a widely adopted open source platform, stands out for its functionalities aligned with the principles of UDL and differentiated pedagogy. Indeed, several studies have shown that Moodle improves learner satisfaction, performance and engagement thanks to its collaborative tools and pedagogical adaptation options (Gamage et al., 2022; Kaiss et al., 2023; Safsouf et al., 2020; Yilmaz, 2022; Evardo Jr and Itaas, 2024). In particular, Moodle enables:
• Course adaptability: thanks to built-in algorithms and access restrictions, the platform offers differentiated assessments based on learner performance (Babo et al., 2020).
• A diversity of assessment formats: text, video, images and other media to suit individual learner preferences, in line with UDL principles (Alves et al., 2013).
• Pedagogical data management: facilitating progress monitoring and personalized feedback (Kennel et al., 2021).
However, although Moodle is frequently highlighted for its advantages, it is important to point out that other LMSs also have similar functionalities. A comparative exploration could help confirm whether the results observed in this study are generalizable to other technological contexts. Despite the many studies highlighting the benefits of LMSs, several gaps remain. Research often focuses on the overall impact of platforms on learner performance and engagement, without examining in detail the assessment models they offer. In addition, the use of Moodle for scenarios combining UDL and differentiated pedagogy remains little explored, although preliminary work (Al-Azawei et al., 2017) recommends developing integrated experimental approaches. Furthermore, the evaluation approaches adopted in these studies do not systematically address how the results could be replicated or adapted to other LMSs. This gap raises the need for comparative research to evaluate the effectiveness of similar models on a variety of platforms.
This study adopts a mixed research methodology, combining quantitative and qualitative approaches, to examine the effectiveness of an adaptive pathway in skills assessment. This pathway simultaneously integrates the principles of Universal Design for Learning (UDL) and those of differentiated pedagogy, and is implemented via the Moodle platform. The aim is to determine the impact of this integrated approach on improving learning outcomes, based on feedback from participants.
This study was carried out on a population of 36 learners from the same 2nd-year baccalaureate class in physical sciences, more precisely during the teaching of two different modules of the French course. The selection of this class as the experimental sample was motivated by the following reasons:
• Accessibility and feasibility: The pupils in this class were available to participate regularly in the experimental sessions of the study, which facilitated the collection of consistent data over a defined period.
• Controlled homogeneity: Selection focused on a single stream to minimize the effects of confounding variables such as main language of study or level in French, while ensuring a certain homogeneity in academic profiles.
• Alignment with the research objective: The study’s target population corresponds to learners in bilingual training, with a particular focus on Physical Sciences. This justifies the choice of a group where mastery of French is a key factor in the evaluation of educational outcomes. Although limited to a single class, the sample was designed to meet the following criteria, thus reinforcing its representativeness of the target population:
• Sample size: With 36 participants, the sample exceeds the generally recommended threshold of 30 to ensure minimum statistical validity (central limit theorem). This enables reliable statistical tests to be applied, while making the conclusions generalizable with a reasonable degree of certainty.
• Sampling method: The sample was drawn at random from eligible students in the Physical Sciences stream. Although logistical constraints limited sampling to a particular high school, this choice reflects the relative diversity of learners in a typical educational environment.
The choice to focus on an experimental sample was based on the need to control external variables that may influence the results, while directly assessing the impact of the educational intervention within a well-defined framework. This strategy maximizes the internal validity of the study, which is essential for meeting the research objectives.
In short, the methodology adopted for the selection and description of the sample guarantees a solid basis for the statistical analyses and the inferences drawn from them, while taking into account the practical constraints inherent in experimental research in a school setting.
The study proceeded in three phases, reflecting a structured methodological progression from the observation of the limits of traditional approaches to the design, implementation and evaluation of an innovative solution.
The first phase aimed to examine the limitations of traditional skills assessment methods (diagnostic, formative and summative). It involved:
• Application of traditional assessment methods to a course module.
• Analysis of associated constraints, such as teacher workload, lack of personalized follow-up and difficulty in providing appropriate feedback.
This phase led to the identification of needs for more efficient data management and greater personalization of learning.
The second phase aimed to integrate adaptive assessment combining the principles of Universal Design for Learning (UDL) and differentiated pedagogy to meet individual learner needs. It involved:
• Development of a structured three-stage approach: Sequencing, Multimedia, and Evaluation.
• Implementation of performance thresholds for the automatic orientation of learners to different levels: beginner (B), intermediate (I) or advanced (A). Figure 1 illustrates the logic for assigning levels to learners, and Tables 1 and 2 specify the conditions of access to the activities adapted to each level (a simplified sketch of this routing logic is given after this list).
• Use of Moodle tools to manage adaptive learning paths and interactive plugins (H5P, LevelUp) to enhance motivation and engagement.
The outcome of this phase is a flexible, personalized assessment model that corrects the shortcomings of conventional approaches.
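To make this routing concrete, the sketch below reproduces in Python the kind of threshold-based assignment that the access restrictions implement on the platform. It is a minimal illustration only: the 0–20 scale, the cut-off values and the learner identifiers are assumptions, since the study’s actual conditions are defined in Figure 1 and Tables 1 and 2.

```python
# Illustrative sketch of threshold-based level assignment.
# Thresholds and the 0-20 scale are assumptions, not the study's actual cut-offs.

from dataclasses import dataclass

BEGINNER_MAX = 10.0      # assumed: scores below this are routed to beginner activities
INTERMEDIATE_MAX = 14.0  # assumed: scores below this are routed to intermediate activities


@dataclass
class Learner:
    ident: str
    diagnostic_score: float  # score obtained on the diagnostic test


def assign_level(score: float) -> str:
    """Map a diagnostic score to a learning level: B, I or A."""
    if score < BEGINNER_MAX:
        return "B"  # beginner
    if score < INTERMEDIATE_MAX:
        return "I"  # intermediate
    return "A"      # advanced


def route_activities(learners: list[Learner]) -> dict[str, list[str]]:
    """Group learners by level so that each group receives activities adapted to it."""
    groups: dict[str, list[str]] = {"B": [], "I": [], "A": []}
    for learner in learners:
        groups[assign_level(learner.diagnostic_score)].append(learner.ident)
    return groups


if __name__ == "__main__":
    sample = [Learner("L01", 8.5), Learner("L02", 12.0), Learner("L03", 16.5)]
    print(route_activities(sample))  # {'B': ['L01'], 'I': ['L02'], 'A': ['L03']}
```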
The third phase aimed to evaluate the effectiveness of the adaptive approach and compare its results with those of conventional assessment. It involved:
• Analysis of learners’ performance before and after the intervention.
• Application of adapted statistical tests to measure significant differences and validate the research hypotheses.
This phase validated the potential of adaptive assessment to improve student learning and engagement.
The choice of statistical tests in this study was based on methodological considerations designed to guarantee the validity and reliability of the analyses, while respecting the characteristics of the data collected. Each test was selected according to specific criteria, including data distribution, sample size and the type of comparison envisaged.
For variables whose distributions did not meet the normality assumption (written and oral production), the use of the Wilcoxon signed-rank test is justified. This non-parametric test is particularly suitable when:
• The sample size is limited or moderate and the normality assumption is not met.
• The data are matched (comparison between a pre-test and a post-test for the same individuals).
• The scale of the data is at least ordinal, which is the case here where performance is recorded on scales that can be classified but not necessarily at intervals.
Unlike parametric tests such as the t-test, the Wilcoxon test does not rely on strict assumptions about the distribution of the data. It is based on the ranking of differences between pairs of measurements, enabling the detection of significant variations in conditions where parametric methods could lead to biased conclusions.
The Student’s t-test was used for variables meeting the normality assumption (text study, language activity and total control). This parametric test is preferred in these cases because of its superior statistical power, which enables even modest differences between two means to be detected. Its robustness, combined with an applicability confirmed by the normality tests, makes it a suitable tool for analyzing the effects of the intervention on these variables.
This combination of parametric and non-parametric tests in the analysis reflects a hybrid methodological approach that adapts to the diversity of the data. It guarantees:
• Greater precision: Each type of data is treated appropriately, minimizing analytical bias.
• Comparability: results obtained using different tests remain consistent and interpretable within a global approach.
• Scientific rigor: By using tests adapted to the specific characteristics of the variables, the study avoids the methodological errors associated with the uniform application of inappropriate tests.
In short, this combined testing strategy makes the most of the available data and ensures that the conclusions about the effectiveness of the educational intervention are scientifically sound and statistically solid.
We used various data collection tools on Moodle, with a particular focus on the gradebook, and processed the data in Microsoft Excel before analyzing it statistically in SPSS. Table 3 shows the Moodle gradebook. The data labeled “Moodle Control - Pre-Test” represent the grades obtained by participants on the classic skills assessment path, more precisely at the end of the first module, while the data labeled “Moodle Control - Post-Test” reflect the grades obtained on the adaptive skills assessment path, at the end of the second module. We used descriptive statistics to characterize our sample on the pre-test and post-test measures. The results show that the averages of all post-test assessment items exceed those of the pre-test, with the exception of the “language activity” item, which suggests a positive impact of our pedagogical approach on learners’ performance. To support this preliminary hypothesis scientifically, we used inferential significance tests. With a sample of 36 learners, considered sufficiently large (n > 30), the normality test could arguably be skipped without significant consequences, in accordance with the central limit theorem. For the sake of scientific rigor, however, normality tests were carried out, using both the Kolmogorov–Smirnov (K-S) test and the Shapiro–Wilk test in SPSS. For these tests, the null hypothesis (H0) is that the sample follows a normal distribution: if the result is not significant (p-value (Sig.) > 0.05), we retain the null hypothesis. The alternative hypothesis (H1) is that the sample does not follow a normal distribution: if the result is significant (p-value (Sig.) < 0.05), we reject the null hypothesis in favor of the alternative.
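As an illustration of this step, the sketch below shows how an exported gradebook could be summarized and checked for normality with standard Python tools. It is a sketch only: the study itself used Excel and SPSS, and the file name and column labels are hypothetical.

```python
# Hedged sketch of the descriptive-statistics and normality-check step.
# Assumes a CSV export of the Moodle gradebook with hypothetical column names.

import pandas as pd
from scipy import stats

grades = pd.read_csv("moodle_gradebook.csv")  # hypothetical export: one row per learner

items = ["text_study", "language_activity", "written_production",
         "oral_production", "total_control"]

for item in items:
    pre, post = grades[f"{item}_pre"], grades[f"{item}_post"]
    diff = post - pre  # paired differences, one per learner

    # Descriptive statistics for the pre/post measures.
    print(item, grades[[f"{item}_pre", f"{item}_post"]].describe().loc[["mean", "std"]])

    # Normality of the differences: H0 = "the sample follows a normal distribution",
    # retained when p > 0.05. Note that SPSS applies the Lilliefors correction to the
    # K-S test, which this plain call does not reproduce exactly.
    sw_stat, sw_p = stats.shapiro(diff)
    ks_stat, ks_p = stats.kstest(diff, "norm", args=(diff.mean(), diff.std()))
    print(f"  Shapiro-Wilk p = {sw_p:.3f}; Kolmogorov-Smirnov p = {ks_p:.3f}")
```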
The differences between the pre-test and post-test assessments for “text study,” “language activity” and “total control” follow a normal distribution, while those for “written production” and “oral production” do not. A common research question is whether two independent or matched groups differ from each other. Student’s t-test is widely used to compare the means of two independent or matched groups; however, when the samples to be compared are not normally distributed, as is the case for the written and oral production data, the Wilcoxon test is recommended as an alternative. The results show a highly significant difference (p < 0.01) for text study and total control, demonstrating the effectiveness of the teaching intervention. Written and oral production also showed a significant difference (p < 0.05), despite the need to use a non-parametric test. Language activity, however, showed no significant difference (p > 0.05), indicating that the intervention had no measurable impact in this area.
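A minimal sketch of how these paired comparisons could be reproduced with SciPy is given below, reusing the hypothetical column names from the previous sketch; the split between normally and non-normally distributed items follows the normality results reported above.

```python
# Paired comparisons: t-test for the normally distributed differences,
# Wilcoxon signed-rank test otherwise (column names follow the sketch above).

from scipy import stats

# Normality outcome reported in the article for the paired differences.
NORMAL = {"text_study", "language_activity", "total_control"}


def compare_pre_post(item, pre, post):
    """Return the name of the test applied to one assessment item and its p-value."""
    if item in NORMAL:
        stat, p = stats.ttest_rel(post, pre)   # paired Student's t-test
        return "paired t-test", p
    stat, p = stats.wilcoxon(post, pre)        # Wilcoxon signed-rank test
    return "Wilcoxon signed-rank", p


# Example (pre/post are the matched score columns of one item, as above):
# name, p = compare_pre_post("oral_production",
#                            grades["oral_production_pre"],
#                            grades["oral_production_post"])
# print(name, "significant at 0.05" if p < 0.05 else "not significant")
```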
This research aims to facilitate the skills assessment process by adopting a more personalized approach. The study implemented an adaptive assessment path aligned with the principles of differentiated pedagogy and those of Universal Design for Learning. The experiment was carried out with 36 second-year Baccalaureate students. The percentages of improvement and non-improvement were calculated from the differences between the pre- and post-test scores in five categories: a positive difference indicated improvement; otherwise, no improvement. The results show that 88.9% of participants improved in Text study, 80.6% in Total control, and 77.8% in Oral production. The improvement was more modest for Language activity (50%) and Written production (55.6%).
The approach adopted in the experiment yielded significant results: the integrated approach improved the accuracy with which skills acquisition was measured, highlighting each learner’s strengths and weaknesses, and made it possible to adjust assessments to precisely match the specific skills and profile of each learner. The proposed model helped to minimize the systemic biases often associated with uniform, conventional assessment. The method has proven its effectiveness in alignment with previous findings (Hermino and Arifin, 2020; Shemshack and Spector, 2020): these researchers found that personalizing learning strategies leads to improved academic performance, which corroborates the results of our approach. The study also illustrates the approach’s ability to minimize the biases inherent in the classic assessment process, in line with the arguments presented by Loosli (2016) and Endrizzi and Rey (2008), who highlighted the need to improve skills assessment methods in order to overcome the rigidity of the conventional process. The relevance of our approach also aligns with the study by Cattan (2020), which explores the optimization of learning through adaptability.
Our approach optimizes all three types of assessment: diagnostic, formative and summative. For diagnostic assessment, Moodle performs an algorithmic allocation of learning activities to learners based on an analysis of their academic performance in the diagnostic test; this systematic methodology makes it possible to plan personalized learning, grouping learners into homogeneous competence levels for each activity. For formative assessment, Moodle’s classification algorithms dynamically adapt the individual learning path by interpreting data on learner performance and interactions, offering a learning path specifically adjusted to each learner. With regard to summative assessment, the approach reduces the time spent on key teacher tasks: first, it minimizes the time allocated to printing tests, which is also beneficial from an ecological point of view; second, the automation of marking guarantees fairness and facilitates the provision of feedback once students’ answers have been submitted to the Moodle platform. As a learning management system, Moodle offers reliable storage and management of assessment activities, ensuring that every submission is secure and traceable, with the aim of maintaining academic integrity.
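The improvement percentages reported at the start of this discussion follow directly from counting positive post-minus-pre differences. A minimal sketch of that calculation is given below; the counts mentioned in the comment are derived from the reported percentages rather than from the raw data.

```python
# Share of learners with a positive post-minus-pre difference for one assessment item.

def improvement_rate(pre, post):
    """Percentage of learners whose post-test score exceeds their pre-test score."""
    diffs = [b - a for a, b in zip(pre, post)]
    return 100.0 * sum(1 for d in diffs if d > 0) / len(diffs)

# With 36 learners, 32 positive differences give 100 * 32 / 36 ≈ 88.9%, matching the
# figure reported for Text study; 29/36 ≈ 80.6% and 28/36 ≈ 77.8% likewise match
# Total control and Oral production.
```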
As a result, the proposed adaptive process has not only optimized teachers’ efficiency in relation to assessments, but has also contributed to pedagogical equity by ensuring that all learners are assessed fairly and in a way that is adapted to their specific needs (Galipeau, 2018). However, the risk of cheating in a digital environment remains a major concern: students can access unauthorized resources and share answers with each other during assessments. To mitigate this risk, reliable monitoring tools dedicated to learning in digital environments are recommended.
In analyzing student scores, it is worth noting that, despite a general trend toward improvement thanks to the application of adaptive assessment methods, disparities were observed. Some learners showed steady improvement across various skill areas, while others showed fluctuations or no progress at all. These disparities can be explained by intrinsic and extrinsic factors such as learner motivation, since research suggests that motivation plays a crucial role in learner success; the learning environment, the quality of pedagogical interaction and the social climate can also affect learning outcomes (Miletić et al., 2024). Taking these factors into account can further refine assessment strategies while aiming to maximize the inclusion of all learners (Ayan, 2015). The results of our study show that individual learning methods (Chang et al., 2022) and access to adequate educational resources (Li, 2016) vary significantly among learners. A detailed examination of the results by category also reveals that some skills are better integrated than others. The notably strong performance in written and oral production is in line with the observations of Hoang and Ngoc (2021), who suggest that mastery of communication skills tends to exceed that of other skills, whereas weaker results in text analysis may call for a reassessment of pedagogical strategies or for aids to identifying appropriate learning styles. This highlights the need to rethink diagnostic assessment to take account of students’ preferences and learning styles, and to rationalize access to resources. The results of this study open the way to designing a more refined and holistic skills assessment process, enabling pedagogical methods to be adjusted in a more targeted way.
The aim of this study was to examine the impact of the synergistic integration of differentiated pedagogy and Universal Design for Learning (UDL) on the improvement of assessment methods within an e-learning platform (the Moodle LMS). Adopting a mixed research methodology combining quantitative and qualitative approaches, we examined the effectiveness of an adaptive pathway in skills assessment. The results showed that this integrated approach facilitates a more personalized assessment of skills, enabling a better understanding of learners’ individual needs. By tailoring assessments to each learner’s skills, the adaptive pathway reduced the potential biases inherent in a uniform assessment for all learners. However, despite an overall improvement in performance, disparities were observed, underlining the importance of taking into account factors such as motivation and individual learning styles. Analysis of the results also revealed differences in the assimilation of skills, underlining the need to re-evaluate the pedagogical strategies employed. In conclusion, this research highlights the positive impact of an integrated approach to assessment on learning, while underlining the importance of taking account of learners’ individual needs and abilities in the assessment process. Looking ahead, exploring online motivational factors in various teaching contexts and proposing a tool that provides learning resources adapted to each learner represent promising avenues for further enriching assessment methods and promoting a more effective and rewarding learning experience for all learners.
The results of this study lead to several strategic recommendations. First, it is crucial to develop advanced digital tools designed to adapt learning resources to the individual needs of learners, ensuring that the principles of Universal Design for Learning align with those of differentiated pedagogy. It is also recommended that researchers undertake longitudinal and replication studies to consolidate understanding of the long-term impact of our approach. Finally, we encourage studies on the challenge of teachers’ professional development in the face of learner diversity and of changes in the world of education.
The datasets presented in this study can be found in online repositories. The names of the repository/repositories and accession number(s) can be found in the article/Supplementary material.
This study was carried out as part of an educational experiment with high school students belonging to the Regional Academy of Education and Training (AREF) in Rabat. A request for approval was submitted to the AREF in Rabat, and authorization was obtained before the experiment began. The study complied with the ethical guidelines governing research in the field of education, in particular with regard to the protection of students’ personal data. The data collected (the grades obtained by students on the Moodle platform) were anonymized before being analyzed and published. No names, personal identifiers or any other information that could identify the participants were disclosed, thus guaranteeing confidentiality and the protection of their privacy.
MM: Conceptualization, Formal analysis, Funding acquisition, Methodology, Project administration, Writing – original draft, Writing – review & editing. ME: Conceptualization, Visualization, Writing – original draft, Writing – review & editing. LL: Resources, Validation, Writing – original draft, Writing – review & editing. SF: Data curation, Visualization, Writing – original draft, Writing – review & editing. KM: Supervision, Visualization, Writing – original draft, Writing – review & editing.
The author(s) declare that no financial support was received for the research, authorship, and/or publication of this article.
We would like to express our deep gratitude to Professor Salima Chbihi Kaddouri and her students for their essential contribution to the realization of the experiment described in this study. Their commitment was an invaluable asset to our research.
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
The Supplementary material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/feduc.2025.1498233/full#supplementary-material
Al-Azawei, A., Parslow, P., and Lundqvist, K. (2017). The effect of universal Design for Learning (UDL) application on E-learning acceptance: a structural equation model. Int. Rev. Res. Open Distributed Learn. 18:2880. doi: 10.19173/irrodl.v18i6.2880
Almumen, H. A. (2020). Universal Design for Learning (UDL) across cultures: the application of UDL in Kuwaiti inclusive classrooms. SAGE Open 10:215824402096967. doi: 10.1177/2158244020969674
Alves, G. R., Viegas, M. C., Marques, M. A., Costa-Lobo, M. C., Silva, A. A., Formanski, F., et al. (2013). Impact of different Moodle course designs on students’ performance. Int. J. Eng. Pedagogy 3:18. doi: 10.3991/ijep.v3iS2.2397
Ayan, E. (2015). Moodle as builder of motivation and autonomy in English courses. Open J. Modern Linguist. 5, 6–20. doi: 10.4236/ojml.2015.51002
Babo, R., Babo, V., Suhonen, J. L. T., and Tukiainen, M. (2020). E-assessment with multiple-choice questions: a 5 year study of students’ opinions and experience. J. Inform. Technol. Educ. 19, 001–029. doi: 10.28945/4491
Banes, D., Hayes, A., Kurz, C., and Kashalnagar, R. (2020). Using information communications technologies to implement Universal Design for Learning. Report.
Bondie, R. S., Dahnke, C., and Zusho, A. (2019). How does changing “one-size-fits-all” to differentiated instruction affect teaching? Rev. Res. Educ. 43, 336–362. doi: 10.3102/0091732X18821130
Cattan, O. (2020). L’adaptabilité comme compétence pour les systèmes de dialogue orientés tâche. Conference communication.
Chang, Y.-C., Li, J.-W., and Huang, D.-Y. (2022). A personalized learning service compatible with Moodle E-learning management system. Appl. Sci. 12:3562. doi: 10.3390/app12073562
Cook, S. C., and Rao, K. (2018). Systematically applying UDL to effective practices for students with learning disabilities. Learn. Disabil. Q. 41, 179–191. doi: 10.1177/0731948717749936
Delaney, T. A., and Hata, M. (2020). Universal Design for Learning in assessment: supporting ELLs with learning disabilities. Latin Am. J. Content Lang. Integrated Learn. 13, 79–91. doi: 10.5294/laclil.2020.13.1.5
Evardo, O. J. Jr., and Itaas, E. C. (2024). Enhancing students’ performance on least taught topics in basic calculus through Moodle-based courseware package. J. Math. Sci. Teacher 4:em060. doi: 10.29333/mathsciteacher/14310
Galipeau, L. (2018). Impact des applications en salle de classe de la conception universelle de l’apprentissage sur le français écrit en français langue seconde. Report.
Gamage, S. H. P. W., Ayres, J. R., and Behrend, M. B. (2022). A systematic review on trends in using Moodle for teaching and learning. Int. J. STEM Educ. 9:9. doi: 10.1186/s40594-021-00323-x
García-Campos, M.-D., Canabal, C., and Alba-Pastor, C. (2020). Executive functions in universal design for learning: moving towards inclusive education. Int. J. Incl. Educ. 24, 660–674. doi: 10.1080/13603116.2018.1474955
Hehir, T. (2002). Eliminating ableism in education. Harv. Educ. Rev. 72, 1–33. doi: 10.17763/haer.72.1.03866528702g2105
Hermino, A., and Arifin, I. (2020). Contextual character education for students in the senior high school. European J. Educ. Res. 9, 1009–1023. doi: 10.12973/eu-jer.9.3.1009
Hoang, V. N. N., and Ngoc, T. H. (2021). The effects of integrating listening and speaking skills into Moodle-based activities. Proceedings of the 17th International Conference of the Asia Association of Computer-Assisted Language Learning (AsiaCALL 2021), 533.
Kaiss, W., Mansouri, K., and Poirier, F. (2023). Pre-evaluation with a personalized feedback conversational agent integrated in Moodle. Int. J. Emerg. Technol. Learn. 18, 177–189. doi: 10.3991/ijet.v18i06.36783
Kennel, S., Guillon, S., Caublot, M., and Rohmer, O. (2021). La pédagogie inclusive: représentations et pratiques des enseignants à l’université. La nouvelle revue Éducation et société inclusives 2, 23–45. doi: 10.3917/nresi.090.0023
Kusumaningsih, S. N. (2021). Incorporating inclusive assessment principles and universal design for learning in assessing multicultural classroom: an autoethnography. LLT J. 24, 337–348. doi: 10.24071/llt.v24i2.3656
Li, Y. W. (2016). Transforming conventional teaching Classroom to learner-Centred teaching Classroom using multimedia-mediated learning module. Int. J. Inform. Educ. Technol. 6, 105–112. doi: 10.7763/IJIET.2016.V6.667
Loosli, C. (2016). Analyse du concept « approche par compétences ». Recherche en soins infirmiers 124, 39–52. doi: 10.3917/rsi.124.0039
Miletić, Đ., Jadrić, I., and Miletić, A. (2024). Influence of motivation climate on service-learning benefits among physical education students. European J. Educ. Res. 13, 1031–1041. doi: 10.12973/eu-jer.13.3.1031
Odier-Guedj, D., Lefèvre, L., and Boisvert Hamelin, M.-E. (2023). Valeurs et modalités de mise en œuvre de la conception universelle de l’apprentissage dans les pays de langue française: Une étude de portée. La nouvelle revue Éducation et société inclusives 97, 133–155. doi: 10.3917/nresi.097.0133
Rogers-Shaw, C., Carr-Chellman, D. J., and Choi, J. (2018). Universal Design for Learning: guidelines for accessible online instruction. Adult Learn. 29, 20–31. doi: 10.1177/1045159517735530
Safsouf, Y., Mansouri, K., and Poirier, F. (2020). An analysis to understand the online learners’ success in public higher education in Morocco. J. Inform. Technol. Educ. Res. 19, 87–112. doi: 10.28945/4518
Shemshack, A., and Spector, J. M. (2020). A systematic literature review of personalized learning terms. Smart Learn. Environ. 7:33. doi: 10.1186/s40561-020-00140-9
Smale-Jacobse, A. E., Meijer, A., Helms-Lorenz, M., and Maulana, R. (2019). Differentiated instruction in secondary education: a systematic review of research evidence. Front. Psychol. 10:2366. doi: 10.3389/fpsyg.2019.02366
Van Geel, M., Keuning, T., Frèrejean, J., Dolmans, D., Van Merriënboer, J., and Visscher, A. J. (2019). Capturing the complexity of differentiated instruction. Sch. Eff. Sch. Improv. 30, 51–67. doi: 10.1080/09243453.2018.1539013
Wormeli, R. (2023). Fair isn’t always equal: Assessment & Grading in the differentiated Classroom. 2nd Edn. New York: Routledge.
Keywords: adaptive learning, differentiated teaching, skills assessment, ICT, UDL
Citation: Machkour M, El Jihaoui M, Lamalif L, Faris S and Mansouri K (2025) Toward an adaptive learning assessment pathway. Front. Educ. 10:1498233. doi: 10.3389/feduc.2025.1498233
Received: 18 September 2024; Accepted: 14 January 2025;
Published: 04 February 2025.
Edited by: Yurena Alonso, University of Zaragoza, Spain
Reviewed by: Valerie Harlow Shinas, Lesley University, United States
Copyright © 2025 Machkour, El Jihaoui, Lamalif, Faris and Mansouri. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Mounia Machkour, mounia.machkour-etu@etu.univh2c.ma