AUTHOR=Kubsch Marcus, Czinczel Berrit, Lossjew Jannik, Wyrwich Tobias, Bednorz David, Bernholt Sascha, Fiedler Daniela, Strauß Sebastian, Cress Ulrike, Drachsler Hendrik, Neumann Knut, Rummel Nikol TITLE=Toward learning progression analytics — Developing learning environments for the automated analysis of learning using evidence centered design JOURNAL=Frontiers in Education VOLUME=7 YEAR=2022 URL=https://www.frontiersin.org/journals/education/articles/10.3389/feduc.2022.981910 DOI=10.3389/feduc.2022.981910 ISSN=2504-284X ABSTRACT=
National educational standards stress the importance of science and mathematics learning for today’s students. However, across disciplines, students frequently struggle to meet learning goals about core concepts like energy. Digital learning environments enhanced with artificial intelligence hold the promise of addressing this issue by providing individualized instruction and support for students at scale. Scaffolding and feedback, for example, are both most effective when tailored to students’ needs. Providing individualized instruction requires continuous assessment of students’ individual knowledge, abilities, and skills in a way that is meaningful for providing tailored support and planning further instruction. While continuously assessing individual students’ science and mathematics learning is challenging, intelligent tutoring systems show that it is feasible in principle. However, the learning environments in intelligent tutoring systems are typically not compatible with the vision of what effective K-12 science and mathematics learning looks like. This leads to the challenge of designing digital learning environments that allow for both meaningful science and mathematics learning and the reliable and valid assessment of individual students’ learning. Today, digital devices such as tablets, laptops, and digital measurement systems increasingly enter science and mathematics classrooms. As a consequence, students’ learning increasingly produces rich product and process data. Learning Analytics techniques can help to automatically analyze these data in order to obtain insights about individual students’ learning, drawing on general theories of learning and relative to established domain-specific models of learning, i.e., learning progressions. We call this approach Learning Progression Analytics (LPA). In this manuscript, building on evidence-centered design (ECD), we develop a framework to guide the development of learning environments that provide meaningful learning activities and data for the automated analysis of individual students’ learning, the basis for LPA and for scaling individualized instruction with artificial intelligence.